DimplesApplePie Spicy
Subject: Persistent 502 Bad Gateway Issues on Second Life Wiki
Dear Second Life Wiki Team,
I am writing to express my concern regarding the repeated 502 Bad Gateway errors encountered on your website (https://wiki.secondlife.com/wiki). These persistent issues have been ongoing for about a month now and are significantly impacting usability. Could you please provide an estimated timeline for resolution? The current situation is becoming increasingly difficult to manage.
Additionally, I would like to inquire about the possibility of downloading the entire wiki to host it locally on my Mac. This would help avoid such interruptions and improve my workflow considerably. Could you advise on whether this option is available and, if so, how to proceed?
Lastly, if the current hosting infrastructure is a limiting factor, I would recommend considering migration to a more reliable provider, such as Hostinger, to enhance stability.
Thank you for your understanding and support. I look forward to your prompt response.
Gwyneth Llewelyn
As far as I can see, it's back, but there are still some persisting issues (some built-in macros, such as syntax highlighting for LSL code, aren't working). Regarding the built-in LSL editor and its own LSL Help, the problem is that the viewer still has http:// links but the current configuration mandates https:// links — this is not an issue with the Wiki itself but with the viewer (I only tested the latest SL Viewer; other TPVs might work differently).
As for the rest of your requests, I share your pain, but...
- You can grab the whole (or almost the whole) SL Wiki using a terminal-based command like wget. Obviously, this would be just a static version of the wiki, taken at a certain moment in time, and would probably not get you everything. It would also take a considerable amount of time if you want it all — the SL Wiki has over 45,000 pages (not including version changes — because then you'd need to download almost 300,000 entries), plus over 8,500 media attachments (mostly images).
The good news is that you might get away with 'only' 10,000+ pages, if you aren't interested in talk pages, user pages, category and template pages, and other minor trivial ones. Crafting the correct wget command to get what you want (and exclude the rest) isn't trivial, but it is feasible.
- You can see a pretty recent and up-to-date version on the Wayback Machine. More recent versions, however, seem to suffer from the same problem 😢
- It's unlikely that LL will give you their MediaWiki backups, and in any case, that would require considerable experience to set it up and flawlessly mirror its configuration on your own Mac — unless you're very, very proficient at installing & configuring MediaWiki, which is not the easiest of web-based tools out there (it just happens to be the best wiki software...).
- Even if you had ways of persuading LL to give you their backups, you'd need one every day (at least), since there is always ongoing tinkering with the wiki (the same applies to using wget, of course, but at least you could just add a periodic launchd entry to do that for you in the background — wget is clever enough to figure out what has changed and just retrieve those pages. Well, most of the time).
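For what it's worth, a wget invocation along these lines is a reasonable starting point (a sketch only: the namespace filters for talk, user, category and template pages are my own guesses at what to exclude, and you would want to tune the politeness delay and the regex for your needs):

```shell
# Sketch of a static mirror of the SL Wiki with GNU wget (not an official procedure).
# --mirror          : recurse with timestamping, so re-runs fetch only changed pages
# --convert-links   : rewrite links so the local copy can be browsed offline
# --page-requisites : also fetch the images/CSS each page needs to render
# --adjust-extension: save pages with .html so a browser opens them directly
# --wait=1          : be polite and pause between requests
# --reject-regex    : skip talk/user/category/template/special pages and old revisions
wget --mirror --convert-links --page-requisites \
     --adjust-extension --no-parent --wait=1 \
     --reject-regex='(Talk:|User:|User_talk:|Category:|Template:|Special:|action=|oldid=)' \
     https://wiki.secondlife.com/wiki/Main_Page
```

Because --mirror implies timestamping, re-running the same command later usually fetches only the pages that changed since the last run — which is exactly what makes a periodic launchd job practical.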
- Last but not least, and with due respect to the Lithuanian entrepreneurs who set Hostinger up and have kept it running for 20+ years, they're really no match for what Linden Lab is using, which is a mix of Akamai/Edgesuite and Amazon Web Services. Rest assured, both have plenty of capacity and already power a substantial part of the whole world 😂 I had never heard of Hostinger before, and I'm glad they still thrive in their market and have expanded their operations to 10 data centres world-wide, which is a good start, but... really, there is no comparison.
Obviously, you can use the best, fastest, most reliable hosting infrastructure in the world, but that is next to worthless if the software is broken in the first place! 😅