33

This is not a duplicate of "Keeping closed sites in suspended animation: a modest proposal": the main point of that proposal is to allow a process in Area 51 to "reopen" a closed site, while this one merely asks for closed sites to remain alive and read-only, without a special procedure for "reopening".

When a beta site is closed, the questions that can be migrated are given to other sites. But what about the questions that aren't relevant on other sites?

Let's say you have a question that was fairly highly voted and has an accepted answer. That question may be useful to someone else in the future, even if it will not be useful to as many people as a Stack Exchange site requires. If that question is deleted, then in the future, people searching for answers to a question that happened to be covered by a closed beta will not be able to find them.

What I propose is that when a beta site dies, instead of being deleted, it should be turned into an "archive site".

The archive site would basically be the same as the regular site, except there would be a big "Archive" at the top (it would look just like the meta site, except with "archive" instead of "meta", and colored even more gray), and it would be read-only. People would be able to view the questions and answers, but no one would be able to add anything or vote.

Also, I would suggest that Stack Exchange keep track of the traffic that comes into that site. If, for some reason, the topic of that dead beta becomes of interest to enough people, the site could be recreated.

Any questions, comments, suggestions?

5
  • 8
    Along with a cobwebs-and-skeletons theme!
    – Flimzy
    Commented Jun 13, 2012 at 21:36
  • 1
    As this one has an answer from SE, I don't think it should be closed as a duplicate of a question that doesn't have one, @EnergyNumbers. Commented Dec 14, 2013 at 12:25
  • why was this marked as a duplicate of a question that was asked a month after this one?...
    – Ephraim
    Commented Mar 17, 2014 at 1:01
  • 1
    The other question is asking about making sites possible to reopen in the future. While there is a fair amount of overlap, the intent of this question is different. I have cast a reopen vote.
    – hat
    Commented Dec 14, 2018 at 7:22
  • 1
    I personally lost a lot of valuable content that I've contributed to failed betas, and I'm not the only one. Deleting valuable content is really irritating, and the counter-arguments I've seen don't hold much water. Commented Jan 18, 2021 at 12:45

4 Answers

19

I have to respectfully disagree with the arguments here.

The whole point of Google's ranking algorithm is to rank active and useful content higher than useless, low-quality content. So it is impossible for a site to remain both dead and high in Google's ranking at the same time: either the site remains dead and people hardly find it on Google, or it remains useful and thus does make the internet better.

I also cannot support the concern about "Sorry, that information can no longer be updated.":

  • 99% of the internet cannot be updated by its users, yet people have no problem with that.

  • It can be clearly and honestly stated that, in order to bring the site back to life, people can go to Area 51 and push for it.

If SE is really concerned about its brand "getting hurt", it can easily make the site look completely different: redirect it to another domain, change the colours and style, if that matters.

What really makes the internet a worse place is creating dead 404 pages. People who link to this site work hard to provide quality content to their readers, and breaking their links is simply unfair to them and breaks their faith.

At the very least, it would be fair to state clearly and prominently on every page that the content can disappear at any time, if that is the case.

In contrast, keeping links intact will give people faith in the site's stability and in the support of their work, motivating them to work on the site's revival, which is precisely what is needed to make the internet a better place, in my opinion.

On the other hand, making the content hard to access will only turn people off. In fact, the more generous and active they were, and the harder they worked for the site, the more turned off, disappointed, and betrayed they will feel. They will look for other forums and find them. So who remains? Less active people who don't care, and the majority who aren't even aware the site ever existed. I am sad to say it, but I just don't see how this can make the internet better.


EDIT: Upon request by Dan Dascalescu, adding his comment:

Another aspect is that users are now well used to archived content. Reddit archives threads older than 6 months, and the vast majority of content on Reddit is exactly like that: archived, impossible to modify, but not deleted, because it's very useful, and the downside of being unable to upvote a Reddit comment is far smaller than the upside of being able to read it in the first place.

3
  • 5
    this is spot on in my opinion Commented May 18, 2020 at 19:38
  • 5
    Excellent answer. Can you please incorporate the Reddit argument and anything else from my comments on this other negative answer? Commented Jan 18, 2021 at 12:44
  • 2
    "the more turned off, disappointed, feeling betrayed they become" - exactly how I felt when yet another site was taken down, deleting all the content I contributed to it. This other user too. Deleting these sites is a sort of digital beheading of upstanding citizens who have committed no crime. Commented Jan 18, 2021 at 12:47
15

There are some substantial unintended side-effects that would make this somewhat undesirable.

It doesn't make the Internet better

We have a very ambitious company goal: "Make the Internet a better place to get expert answers to your questions." Even the content worth saving has pretty much peaked at the time the site was closed. It will never be updated or improved, ever. Users trying to correct the information will be met with "Sorry, that information can no longer be updated." Users seeking help will be met with "Sorry, this site is closed." A wasteland ghost town does not make for a pleasant end-user experience.

It hurts the network

Long after the site has been closed, remember, Google search will continue drawing people into this outdated legacy content… and those potential users aren't going to understand the history and nuance of an "archived" site. Their first experience will be a sad little place where content hasn't been added or updated in recent memory. The apparent neglect will detract irreparably from the sites that work hard to keep their content high quality and up to date. A bad first impression is really hard to recover from.

The Poison Pill

Once you lock in a failed site, that's a pretty big head-on-a-stick gesture for future audiences. Let's say someone wants to start an AI site again — it will happen; our network is only getting bigger — that's a pretty dire legacy to overcome. "That didn't work and we could never do that again." Don't lock in failure.

10
  • 2
    Why not marking this as status-declined then? Commented Oct 11, 2012 at 13:49
  • 8
    @ShaWizDowArd I'm just adding my thoughts to the conversation. I'm not issuing a decree... and the conversation has only been here for 15 minutes. Commented Oct 11, 2012 at 13:53
  • Fair enough, I'm probably too used to moderators casting a final vote on such suggestions and either accepting or declining them. And it's been here for 5 months. :) Commented Oct 11, 2012 at 14:19
  • 2
    @ShaWizDowArd The idea itself has merit. I just believe the downsides far outweigh the benefit. It was Area 51 for 5 months, but only moved here for discussion (as of now) 50 minutes ago. Commented Oct 11, 2012 at 14:28
  • Oh, my bad.. didn't notice it was migrated. Thanks for explaining! Commented Oct 11, 2012 at 14:30
  • 2
    I begrudgingly agree with this; I'd like for some of that information to be much more easily accessible for those of us who are looking for it, but it's just not pleasant for people who won't understand the whole situation, and the content will be stuck in disrepair.
    – Zelda
    Commented Oct 11, 2012 at 14:50
  • what if the "archive-sites" were only accessible to users who had above a certain reputation on one of the SE sites? I guess it wouldn't really solve the "head-on-a-stick" problem, but it would fix the problem of new users seeing the site through google search.
    – Ephraim
    Commented Oct 11, 2012 at 19:08
  • 2
    @Ephraim There's still a trust issue of providing unmaintained content... and users following links leading into impassable dead-end walls. At some point, it just seems like a bad idea and time to move on. Commented Oct 11, 2012 at 19:19
  • 2
    As of 2021, this answer is wrong on many levels. Most are addressed by Dmitry's answer. Another aspect is that users are now well used to archived content. Reddit archives threads older than 6 months, and the vast majority of content on Reddit is exactly like that: archived, impossible to modify, but not deleted, because it's very useful, and the downside of being unable to upvote a Reddit comment is far smaller than the upside of being able to read it in the first place. Commented Jan 18, 2021 at 12:42
  • Have you considered archiving the site under a different domain name then? No brands will be hurt if you serve a static copy of example.stackexchange.com under example.failedexchange.com Commented Nov 23, 2022 at 8:07
12

This would be excellent. While closed-off betas already have a data dump, it would be much easier if we could still use the SE engine to access it. Specifically, searching and navigating the data would be much easier through the SE engine than through the XML files from the data dump.
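In the meantime, the dump's XML files are at least searchable with a short script. Here is a minimal sketch, assuming the standard public data-dump layout in which `Posts.xml` holds one self-closing `row` element per post, with `PostTypeId`, `Title`, and `Score` attributes (the function name is my own, not part of any SE tooling):

```python
import xml.etree.ElementTree as ET

def search_questions(posts_xml_path, keyword):
    """Return (score, title) pairs for questions whose title mentions
    the keyword, highest-scored first."""
    matches = []
    for _, row in ET.iterparse(posts_xml_path, events=("end",)):
        if row.tag != "row":
            continue
        # PostTypeId "1" marks a question in the data-dump schema
        if row.get("PostTypeId") == "1":
            title = row.get("Title", "")
            if keyword.lower() in title.lower():
                matches.append((int(row.get("Score", "0")), title))
        row.clear()  # keep memory bounded on large dumps
    return sorted(matches, reverse=True)
```

Using `iterparse` rather than loading the whole tree matters here, since dump files for even a small beta can run to hundreds of megabytes.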

1
  • 7
    A simple viewing engine for the archived site zip file would be useful. A bit surprised this doesn't exist yet.
    – occulus
    Commented Apr 1, 2013 at 17:50
3

If you are not after the official, reference-quality resource, you can do this yourself. I wrote a tiny Lua/shell script to regenerate the meta for the failed Freelancers site:

https://github.com/chalst/sxdatadump2lua

giving the output seen at

http://www.textproof.com/ar/freelancers-meta/

I like the approach in my code, but since other hackers probably will not, I note that it's pretty easy to put together your own. Remember that the data dumps carry a licensing requirement to link back to the site.
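The script linked above is Lua/shell; as an independent illustration of the same idea (not the author's code), here is a hedged Python sketch that walks `Posts.xml` from a dump and emits one static HTML page, appending the attribution link the license requires. The file names, output layout, and function name are all illustrative assumptions; only the `row`/`PostTypeId`/`Title`/`Body` schema follows the public dump format.

```python
import html
import xml.etree.ElementTree as ET

def render_archive(posts_xml_path, out_path, site_url):
    """Render every question in the dump into one static HTML page,
    ending with the CC BY-SA attribution link back to the source site."""
    parts = ["<html><body><h1>Archived questions</h1>"]
    for _, row in ET.iterparse(posts_xml_path, events=("end",)):
        if row.tag == "row" and row.get("PostTypeId") == "1":
            parts.append("<h2>%s</h2>" % html.escape(row.get("Title", "")))
            parts.append(row.get("Body", ""))  # Body attributes already contain HTML
        row.clear()  # keep memory bounded on large dumps
    # the dump's license requires attribution back to the source site
    parts.append('<p>Content from <a href="%s">%s</a> under CC BY-SA.</p>'
                 % (site_url, site_url))
    parts.append("</body></html>")
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(parts))
    return out_path
```

A real regeneration would also want answers, scores, and per-question pages, but this shows how little code the basic "readable archive" takes.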
