Beta/Staging for Techrights Wiki (Now 100% Static)
And why "old" can be beautiful
THE "old" wiki (technically it's just over 14 years old, so it is not really old except in relative terms) has been fully converted to static pages, but this is still work in progress. We're still fixing many things and accommodating with styles, redirections etc.
The full edit(ing) history is in the database, which we've kept aside as we move on to the "next generation" of this site.
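For those curious how such an archive might be made, here is a minimal sketch, assuming a MySQL/MariaDB-backed wiki (the database name below is a placeholder and credentials are assumed to come from ~/.my.cnf; this is not necessarily our exact procedure):

```python
#!/usr/bin/env python3
"""Archive a wiki's database (edit history included) before retiring it."""
import datetime
import subprocess

DB = "wikidb"  # hypothetical database name
out = f"wiki-history-{datetime.date.today()}.sql.gz"

# mysqldump streams the full schema and data; gzip keeps the archive small
with open(out, "wb") as fh:
    dump = subprocess.Popen(["mysqldump", "--single-transaction", DB],
                            stdout=subprocess.PIPE)
    subprocess.run(["gzip", "-c"], stdin=dump.stdout, stdout=fh, check=True)
    dump.stdout.close()
    if dump.wait() != 0:
        raise RuntimeError("mysqldump failed")
```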
As a mostly technical site, we'd like to take this opportunity to comment on our experiences and some of the lessons learned. First, as noted here a week ago, do not start a wiki unless you're fully aware of the maintenance work it'll take in the long run: not just moderating it (e.g. removing spam) but also upgrading it, looking after a database (most wiki bundles use databases rather than flat files), and potentially dealing with an "end of life" (wiki software that's no longer maintained, i.e. no security patches and probably no compatibility with future versions of Python/PHP/other).
Wikis are generally not efficient in the delivery sense; even with a content delivery network or some caching layer (Squid, Varnish, etc.) a wiki won't be fast, and it becomes even more complex to operate, debug, etc. Serving the same pages over and over again can take a lot of CPU capacity and RAM, not just bandwidth. That leads to disk churn and, potentially, data loss too. When bots are taken into account (a report we added to Daily Links yesterday said 50% of the Net is now bots), one should expect a lot of non-conventional page requests (e.g. hundreds of page "diffs", for no good reason whatsoever) that bypass most caches, including a sophisticated content delivery network. This not only inflates hosting bills but can also cause a spike in load and RAM usage, even a system crash (an unintentional DDoS attack).
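To see why such requests defeat caching, here is a toy simulation (the URL patterns loosely mimic a wiki's, but the traffic mix is made up for illustration, not measured): every "diff" URL is a unique cache key, so almost every bot request is a miss that hits the PHP backend and the database.

```python
"""Toy model: cache hit rate when half the traffic is bots walking diffs."""
import random

cache = set()
hits = misses = 0

for _ in range(10_000):
    if random.random() < 0.5:
        # bots crawl the edit history: each diff URL is effectively unique
        url = f"/wiki/index.php?diff={random.randrange(10**6)}"
    else:
        # humans mostly request a small set of popular pages
        url = f"/wiki/Page{random.randrange(20)}"
    if url in cache:
        hits += 1
    else:
        misses += 1  # every miss means real work for the backend
        cache.add(url)

print(f"hit rate: {hits / (hits + misses):.0%}")
```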
Moving to static makes sense, especially for sites that do not change much and do not involve many authors (or guest authors and unknown editors).
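The general approach is simple enough to sketch. This is not our exact pipeline; the base URL and page list below are placeholders, and a real conversion also has to rewrite internal links and handle redirections:

```python
"""Fetch rendered wiki pages and save them as flat HTML files."""
import pathlib
import urllib.parse
import urllib.request

BASE = "https://wiki.example.org/index.php?title="  # placeholder
OUT = pathlib.Path("static")
OUT.mkdir(exist_ok=True)

for title in ["Main_Page", "About", "FAQ"]:  # stand-in for the real page list
    with urllib.request.urlopen(BASE + urllib.parse.quote(title)) as resp:
        html = resp.read()
    # one flat file per page: no PHP and no database at serving time
    (OUT / f"{title}.html").write_bytes(html)
```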
Back in the 90s not many sites set cookies or served pages personalised to the visitor, based on some profile or a database record for a logged-in user. We just "surfed the Web", navigating our way through random pages, and the only reason they took a while to load was our residential dial-up connection. There's no excuse for pages taking more than 1 second to load on 2023-grade broadband, unless things have become way too bloated.
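Checking is easy; here's a minimal sketch using only Python's standard library (the URL is a placeholder):

```python
"""Time a full page fetch, end to end."""
import time
import urllib.request

URL = "https://wiki.example.org/Main_Page"  # placeholder

start = time.perf_counter()
with urllib.request.urlopen(URL) as resp:
    body = resp.read()
print(f"{len(body)} bytes in {time.perf_counter() - start:.2f}s")
```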
Notice below how fetching a wiki page from faraway Japan takes just 0.1 seconds now (our server is here in England, almost half the globe away). █