39

Stack Overflow depends heavily on moderation by a body of reliable users dedicated to the site, rather than to merely gaining points and badges any way possible. It's vital to be able to tell these two groups apart.

Over time, reputation is going to become a less and less reliable way to do that. The voting system is easy to game, in many ways and shades. There are contributors rising "through the ranks" whose main motivation is not creating a high-quality resource; they just post mediocre stuff in popular tags. Eventually, they will reach hundreds of thousands of rep.

The recent problems with the review queue show that there needs to be a better way of telling users genuinely concerned with quality apart from those just hanging out to earn rep and badges.

Now, machines and algorithms can always be gamed. The best detectors of dedicated users who are "for real" would arguably be - you and me! The dedicated, veteran users. In our everyday interactions on SO and maybe Meta, we come across fellow users and, over time, reach some judgement of them and their actions on the site. I could probably name fifty users right away who I would absolutely trust using the review queue. I could also find fifty users who I would absolutely not trust using the review queue.

Currently, there is no way for me to explicitly tell the system that I trust or distrust these users. Voting doesn't work for this: I may not be active in the same tags as them, or they may not be answering much on SO in the first place, but be busy editing and commenting. Or the bulk of their activity will be on Meta. Or I will know them personally from conferences and such, rather than the site itself. And one vote is only a drop in the bike-shed-ridden bucket anyway.

Should SO create a new, mostly invisible, social layer of trust? A way for a user to tell the system whether they trust a user or not. Obviously implemented in a way that doesn't break the basic SO idea of "it's about the content, not the users".

There are many ways to do this. Most are horribly flawed. There must absolutely not be any kind of public showing off of "trust rates" or something.

One way that might suck slightly less than all the others is a simple yes/no system.

A user would become a trusted user when they receive enough votes of confidence, let's call them "trust votes", from other trusted users. Becoming trusted requires a certain number of trust votes, weighted by how trusted the voters themselves are. A moderator's trust vote could make you trusted immediately. The algorithms used to calculate trustworthiness, and the threshold at which you yourself become trusted, would be secret. Votes of confidence would remain anonymous, so you never know how many people voted for you, or who they were.
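As a rough sketch of how such weighted trust votes might accumulate (the weights and the threshold below are made-up placeholders; the proposal explicitly keeps the real algorithm secret):

```python
# Hypothetical sketch only: the actual weights, threshold, and algorithm
# would be secret and certainly more nuanced than this.

TRUST_THRESHOLD = 10  # placeholder secret threshold

# Each voter contributes a weight based on how trusted they themselves are.
# A single moderator vote carries the full threshold, granting trust at once.
VOTE_WEIGHTS = {
    "moderator": TRUST_THRESHOLD,
    "trusted": 3,
}

def is_trusted(votes):
    """votes: list of voter roles, e.g. ["trusted", "moderator"]."""
    score = sum(VOTE_WEIGHTS.get(role, 0) for role in votes)
    return score >= TRUST_THRESHOLD
```

With these placeholder numbers, one moderator vote suffices, while several trusted-user votes are needed to cross the threshold.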

Once a user becomes a trusted user, the words "trusted user" would show up in their profile; no other public recognition is given.

picture of a profile, with the words "trusted user" next to the username

The trusted user status gives them access to certain moderation tools that are closed to everyone else, even 20k+ users. (To me, the review queue should be one of them.) New moderation tools could become "trusted only" instead of 5k+, 10k+, 20k+ tools. (Migrating some existing privileges to "trusted only" would probably be too unfriendly an act to those who'd stand to lose it that way.)

Users would have a "I trust this person" / "I do not trust this person" button when they visit another user's profile. A trust vote is an important thing and people would have to be encouraged to use their votes wisely. Maybe through a total limit that you have to manage carefully, and that grows very slowly (say, five new votes a month).

As to how to get this whole thing started: at first, give, say, 50 trust votes to a limited circle. The ideal circle would be the people who have undergone the most scrutiny by the community, and are best equipped to have a good impression of users' activities: the moderators.

Let it spread from there, like a Ponzi scheme: moderators trust-voting users, and those again trust-voting others, although maybe with fewer votes than the initial allotment that the moderators get. This would be my favourite method of starting it, because the feature would remain very low-key this way.

To reiterate, this is as non-social as can be. There is no public display of how many people trust you. You never know how many people trust or distrust you, or who they are. All that is visible to you and the public is whether you are a trusted user.

Making sure that only other trusted users can make you a trusted user helps prevent "awesome, this user gave me the codez so I vote to trust him in return" situations. This suggestion is explicitly designed to battle bike-shed voting by limiting the number of people who can vote.

Things that would have to be addressed:

  • Users from tags with little traffic might go unrecognized, as there are fewer people to observe and appreciate their activity.

  • There would have to be the possibility to withdraw "trusted user" status. Most likely if you gain a lot of negative trust votes. This would have to be dealt with carefully.

32
  • 12
    Ha, we were just whining and moaning about this in our mod chat room an hour ago. On a site like SO, reputation is no longer a reliable measure of how much the community trusts you in terms of community moderation, but merely in terms of knowledge in the subject matter (i.e. programming). Commented Nov 3, 2012 at 10:34
  • 1
    @Ben my thinking is that the existence of the feature itself would be enough to encourage people to trust-vote a lot, and the main challenge would be making sure they don't vote too much. It's like a tiny moderator election that runs every day - I'm not worried about attendance. One would have to see how it works out in the real world, though, of course.
    – Pekka
    Commented Nov 3, 2012 at 10:34
  • 12
    @Bolt yeah, and sometimes not even that. You can easily "get rich" by posting 20 cheap jQuery or PHP snippets a day.
    – Pekka
    Commented Nov 3, 2012 at 10:35
  • 2
    Can we expect users to behave/think differently when using this separate voting system?
    – user165950
    Commented Nov 3, 2012 at 10:42
  • 2
    @Pekka This isn't the first time I'm getting the feeling you somehow have found a way to spy on us in Teacher's Lounge ;P
    – yannis
    Commented Nov 3, 2012 at 10:43
  • 7
    One more objection before I upvote :-). I would be worried that the trusted user group quickly becomes a closed circle. Say, no pressure :-), I vote for you and you vote for me. Would it make sense to give trusted users an additional vote a month, say? Commented Nov 3, 2012 at 10:52
  • 3
    +1 for the question, but I don't know how I feel about "trusted" as a boolean value
    – Jeroen
    Commented Nov 3, 2012 at 11:10
  • 1
    What did you expect @Pekka, unicorns and waffles :-). I don't believe that Meta has corrupted you that much! Commented Nov 3, 2012 at 11:50
  • 7
    No, I meant the display of Trusted User in your screenshot on the user profile page might be interpreted incorrectly by some. If someone notices that a user is trusted, they might assume that everything they post in a particular tag is absolutely correct. We need a different word. I hope that I am not being confusing.
    – user162697
    Commented Nov 3, 2012 at 12:31
  • 8
    @Pekka Make it say "Trusted Janitor". Commented Nov 3, 2012 at 12:40
  • 2
    But still you could have sock puppets voting you for trusted user/janitor :P to get that additional vote/privilege.
    – GoodSp33d
    Commented Nov 3, 2012 at 13:23
  • 2
    @2-Stroker: No, you need to be a trusted user yourself, in order to vote for others being trusted. Commented Nov 3, 2012 at 13:35
  • 2
    -1 This looks like an overreaction to the problems introduced by the review queue. Creating an elitist ponzi-scheme system like this one might solve the review queue problems, but it will further aggravate other problems. It is more trouble than it's worth. Commented Nov 3, 2012 at 18:12
  • 1
    @Null it's not really a reaction to those problems specifically - the issue of trust not being equal to high rep is there anyway, and is likely to become more and more relevant. I actually had the idea for this a year or two ago.
    – Pekka
    Commented Nov 3, 2012 at 18:17
  • 7
    @Pekka One immediate problem I see with this is that it will create an unfairly privileged group: one that will have the same tendencies and ideas when it comes to closing and deleting questions. It will be hard for someone who does indeed care about the site, but just thinks differently (ie: an inclusionist vs. a majority of exclusionists) to become a "trusted" user. This will further tip the scale in favor of one group, and promote more divisiveness in the community. Commented Nov 3, 2012 at 18:43

8 Answers

21

Once upon a time, votes for questions and votes for answers received the same ten points. Time passed, and an ever-growing number of people were reaching important rep thresholds based on questions -- and, to make it worse, often they were crowd- (or Fiddler-) pleasing fun or fluff.

Two responses ensued: less reputation for question votes, and clearer guidelines encouraging closure and deletion of fluff.

However, rep, like entropy, is ever-increasing. Even with fewer points for a question, a user who just posts a lot of fair-to-middling questions will eventually reach any threshold.

This question, however, challenges a more basic assumption. A core theory of this site was that editorial privileges should be gated by expertise, as measured by rep. Give or take the edit review system, which only functions at the low end, janitorial efforts are rewarded entirely by badges, while janitorial privileges are awarded based on content.

Changing this would be radical, in the original, root-vegetable, sense of that word. I'm afraid that this proposal would add a confusing mechanism that might not, in the end, fix things.

I submit that 'trust-votes' are, essentially, a continuous, larger-scale, moderator election. As such, this proposal should be considered, pondered, and perhaps modified in the light of expanding the moderator system.

The problem I see is that the actions that should feed an evaluation of granting privileges are not very visible. What users can see of my actions are my answers, comments, edits, and close votes. They can't see my flags or plain old votes.

Ideally, the system would observe my proposals for actions (e.g. flags), and grant me autonomy based on whether my proposals meet with the approval of the mods.

In other words, instead of asking users to trust me based on questionably relevant evidence, the system should decide to trust me by measuring entirely relevant metrics. If my flags are nearly always right, I should be trusted more to do these things. I don't claim to have a completely worked-out proposal here, but I submit the general idea for consideration.

3
  • 5
    Very fair points. I agree that this would amount to the election of thousands of junior moderators. Approaching this from the other end (i.e. changing the mod system) might indeed make more sense.
    – Pekka
    Commented Nov 3, 2012 at 15:50
  • 2
    well, the core issue is that we are measuring two things: 1. skill at programming (or {site topic}, say photography) 2. skill at moderating -- but we are granting incremental moderator abilities solely based on #1 and not at all based on #2, which mostly works but isn't really correct in the big scheme of things. Commented Nov 6, 2012 at 21:31
  • @JeffAtwood indeed that is a concise restatement of my point.
    – Rosinante
    Commented Nov 6, 2012 at 21:35
41

I began noticing some very odd voting shortly after the new review system was implemented. Normally, when I see a completely craptastic post with several up votes, I run straight to our tools to see if I notice any suspicious patterns. After chasing half a dozen geese, I realized that it's just people blindly voting in review.

There are also more users that write .. less than awesome posts now reviewing .. less than awesome posts. So it's not always blind gamification; ignorance is partially to blame.

What remains is, those that can maintain high quality and do participate in review flag this stuff, and we take action. So a partial solution, or perhaps the means of getting data to come up with a solution is this:

Track the amount of negative actions a moderator or trusted (20k) user took on a post any user positively reviewed, by the user(s) that reviewed it.

This should quickly be able to tell us who is doing it wrong, which leads to a better idea of what (if anything) could be done about it. I really want better data on the problem before anything is done to correct it.
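The tracking proposed above could look something like this (the data model and names are invented for illustration, not the actual Stack Exchange schema):

```python
from collections import Counter

def suspect_reviewers(positive_reviews, negative_actions):
    """Count, per reviewer, how many posts they positively reviewed
    that a moderator or 20k user later took negative action on.

    positive_reviews: iterable of (reviewer_id, post_id) pairs
    negative_actions: set of post_ids later closed/deleted/downvoted
                      by a moderator or trusted (20k) user
    """
    counts = Counter()
    for reviewer, post in positive_reviews:
        if post in negative_actions:
            counts[reviewer] += 1
    # Reviewers with the highest counts are the ones most likely
    # to be rubber-stamping posts in review.
    return counts
```

Sorting the resulting counts would surface the reviewers "doing it wrong" that the answer wants data on.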

9
  • 12
    Not sure if it's related to the new review system, but recently I've seen several bad questions with upvotes. And with bad I mean that they're incomprehensible/unanswerable. Commented Nov 3, 2012 at 11:07
  • 10
    @CodesInChaos yup, I'm fairly sure that is 100% tied to the review system.
    – Pekka
    Commented Nov 3, 2012 at 11:08
  • 2
    This is probably a consequence of including voting as a possible action that removes a post from a review queue. As users feel they are doing a good job by limiting the growth of the review queues, some of them will accept voting as the needed action, even if the post doesn't really deserve an up-vote.
    – avpaderno
    Commented Nov 3, 2012 at 12:49
  • 2
    Another great example of the bizarre voting that review badge gaming has caused is this link-only answer promoting a user's blog: stackoverflow.com/questions/7498357/… that was upvoted 4 times in under a minute this morning. If moderators had better metrics on who was possibly reviewing poorly (upvoted posts that were later deleted, accept votes that run contrary to three reject votes on a suggested edit, etc.), we could look over those reviews ourselves and maybe nudge people in the right direction. Commented Nov 3, 2012 at 16:26
  • 1
    @Brad nudge... or slap :) I'm starting to think you could post the first few paragraphs of Das Kapital as a question on Stack Overflow, and under the right conditions it could get a couple of upvotes.
    – Pekka
    Commented Nov 3, 2012 at 18:50
  • I considered posting about 'one second reviews' or 'who reviews the reviewers' but the issue is basically covered here. One idea is to have the option to veto a review. If a review gets N vetos (number of vetos already cast will not be shown) the review won't count. If a user gets many reviews veto'd, s/he could be temporarily banned from reviewing. This may encourage some to actually review posts instead of just up-voting through them for whatever reason. Commented Nov 3, 2012 at 19:51
  • 3
    Well, part of the problem with the review queue is it forces you to upvote or downvote or do something, when quite frankly, not every post needs something to be done with it. So perhaps a lot of people are upvoting as a default measure?
    – jmort253
    Commented Nov 3, 2012 at 22:09
    Of course. The issue I am referring to is upvoting blindly, as in processing several reviews in just a few seconds. If there was an option to just accept as-is, certain people would use that instead of upvoting, without actually reviewing the post. Possibly to get badges, or because the post looks 'well formatted' when you look at it for only a second. I don't know if this happens frequently and just goes unnoticed except for extreme cases. Also there are many different opinions on what's considered proper reviewing. Commented Nov 3, 2012 at 23:21
  • @jmort253: exactly the case. Commented Nov 4, 2012 at 15:53
10

The idea is interesting, but the thing is, people actually vote on posts. People spend a great deal of time going to posts and they generally remember to upvote them if they're good; it's a pretty natural interaction on the web. Going to people's profiles after getting certain magic votes and giving special magic person votes? Let's sanity check that quick.

People don't visit profile pages often. (Limiting factor 1) Profile pages are really busy, so a new button would be hard to notice (limiting factor 2) and some major education stuff would be needed (limiting factor 3). To boot, only "trusted users" are allowed to vote (limiting factor 4). If there are only so many trusted users, it's really easy for the system to break down if just a few people don't vote. And what if we're talking about Stack Overflow? Suddenly tag partitions are a big problem; trusted users like Jon Skeet and Eric Lippert might be great at identifying trusted users in C# (if they care), but they're unlikely to see trusted users in tags they don't use, or worse, ignore (limiting factor 5).

The suggested method of starting this off (moderators voting) sounds bad too, in no small part because of the potential for drama. Suddenly I, as a moderator, am supposed to decide who gets privileges on the site. I'm sure you see the potential problem there.

On top of all of the things that drastically limit the likelihood anyone will actually vote here, suddenly we're voting on people, not posts. With the exception of moderator elections, that's a pretty big paradigm shift.

The problem really shouldn't be judging the quality of users directly and subjectively like this. If anything a user's actions should be judged. If a user consistently approves edits that are rejected or makes edits that are rolled back, maybe they should get a notification or maybe even a brief approval/edit ban. But the whole thing where an extremely limited set of people are expected to vote on an extremely limited set of people via an extremely rare action just seems like an extremely good way to make sure we just plain don't have anyone that can actually use these tools.

If there's a problem with how people use a tool, address that. That's why we have edit bans, require multiple approvals, multiple close votes, etc. Just making a system so that virtually no one can ever use it just makes tools useless and forces the burden back on moderators who are already spread thin. The whole point of community trust and community moderation abilities is to lighten the load of moderators and make moderation a thing everyone can take part in, not just the 0.0001% of people who actually received or cast these new magic votes.

1
  • 2
    I agree. In addition, I think we'll find that no matter what the rules are, there will be people who will game the system to appear better than others (higher rep, getting trusted, etc.). What's to keep some folks from making their friends trusted users regardless of their actual skills or dedication to the site? Commented Nov 3, 2012 at 23:01
4

I like the idea. As jokerdino says, there ought to be some mitigating factors, but I would like to address the potential problems. I think they can be, at least partially, solved by the bit you edited out about allowing everyone to vote, and what remains, i.e. starting with a trusted circle.

To avoid the "circle jerk" phenomenon I would start, as you suggest, with the moderators. Give each moderator unlimited votes. Once a moderator has nominated you, you're automatically "trusted".

Every trusted user then gets a limited number of votes. For example, 12 to start and then one a month. The point of this is for it to be a low number. People should be jealous of their votes so they're less likely to use them unwisely. It would make sense for the first batch to have more votes to spread the love around a little more.

It takes X votes from trusted users in order for yourself to become trusted, say 2.

Then, to up the chances of ensuring that all tags are covered, you allow everyone to vote. To stop everyone from becoming trusted, you still have to be nominated by one trusted user. You also need Y votes from other users. In other words, you become trusted if you have the following:

  • Y votes from "ordinary" users and one vote from a trusted user.
  • OR X votes from trusted users.
  • OR 1 vote from a moderator.
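The three conditions above can be spelled out directly (X and Y as defined in the answer; the concrete values are just the examples given):

```python
X = 2  # example: votes from trusted users needed
Y = 5  # hypothetical: votes from ordinary users needed

def becomes_trusted(ordinary_votes, trusted_votes, moderator_votes):
    """A user becomes trusted if any one of the three paths is satisfied."""
    return (
        (ordinary_votes >= Y and trusted_votes >= 1)  # public + one trusted nomination
        or trusted_votes >= X                         # X trusted users agree
        or moderator_votes >= 1                       # a single moderator suffices
    )
```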

There could be a little list somewhere of people who need a single vote from a trusted user in order to become "trusted".

A trusted user would also get 1 "untrust" user vote each month. It would require another Z votes from trusted users for someone to become no longer trusted. Moderators would be included in this but would have unlimited votes (no accusations of revenge attacks).

Where I would differ from you is that, in time, I think it would be worth applying this to the review queues. I'm not saying that you have to be a trusted user to use them necessarily, but that you would have to have been voted for as a trusted user by a few not-yet-trusted users in order to use them.

4
  • 1
    Perfect, that's exactly how I envisioned how the process could work, but spelled out much clearer. Thanks! One exception: I am in favour of leaving the general public completely out of the equation (ie. only moderators and trusted voters can trust vote), to avoid the thing descending into a popularity contest, and the bike shed effect that normal voting suffers from.
    – Pekka
    Commented Nov 3, 2012 at 12:54
  • When moderators don't have limits in doing something, they are also supposed to use that possibility when strictly necessary. Moderators don't have limits on the number of closed questions, but they are supposed to vote to close when strictly necessary (e.g. blatantly off-topic, not constructive, not a real question, or too localized questions). As moderator, I much prefer having something I can do as normal user; I prefer keeping my 40 daily votes, rather than being called out because I voted 100 posts.
    – avpaderno
    Commented Nov 3, 2012 at 12:56
  • Hmmm, yes @pekka. Without knowledge of how close someone is to being trusted and with the additional restriction that no matter how much you vote for them it'll make no difference without a trusted user coming along and the fact that you have limited votes I think including the general public might be okay. There's enough restrictions that it's impossible to do damage. Commented Nov 3, 2012 at 12:56
  • Fair enough @kiamlaluno, it's something that could be restricted after a while. On the larger sites I think that it's something that would need to spread a bit before it's useful but once that's happened mods could easily be restricted as ordinary users are without it hurting the site. Commented Nov 3, 2012 at 12:58
4

Personally, I think this idea on a whole is misguided.

The purpose of reputation is to fairly and dispassionately measure "trust" for how SE defines it. With this system, we might as well scrap reputation entirely. And it sets up an awful precedent: if certain people don't like who is at the top of the proverbial "trust" meter, scrap it until we come up with a system that matches those people's truthy idea of who should be trusted.

But I wanted to highlight one part of your proposal:

As to how to get this whole thing started: at first, give, say, 50 trust votes to a limited circle. The ideal circle would be the people who have undergone the most scrutiny by the community, and are best equipped to have a good impression of users' activities: the moderators.

One of the great benefits of Stack Exchange and how it even bills itself is that the community runs the sites, not them. Why would you cede this amount of power to a set of people who are elected for life?

For that matter, any time the moderators are asked to do a little more, they push back saying that they don't have time or that they're overworked or that they're just volunteers. Why would any moderator agree to this plan? Having to be solely responsible for deciding who is trusted on a site? If a moderator is overworked or has better things to do, this seems like a real hassle.

I get that you don't like how the review queue is going; I really do. But effectively scrapping the reputation system and ceding a huge amount of power to a select group of people is not the solution.

Rather, the problems you describe can easily be attributed to two things:

  • Tools that could use some improvement, because it's too easy to do bad stuff and too difficult to undo aforementioned bad stuff
  • A largely disinterested user base who haven't, as yet, found a reason to take the review queues as seriously as people like you do

Instead of writing off the tools and creating a far smaller pool of users helping with community moderation, I'd much rather see the tools improve and ideas on how to get the community more engaged.

5
    Scrap the reputation system? Holy cow, no. Just limit access to some review and (in the future) other tools, and maybe add small things here and there. My point is, rep as a measure of trust is broken. Being a 20k user no longer necessarily means you are trusted by the community. Even blankman is approaching 20k, and he hasn't managed to learn the basics of how the site works in 4 years. Not that he'd likely be interested in using the tools, but.... Re seeding, Mods would really be the ideal starting point IMO. It ...
    – Pekka
    Commented Nov 3, 2012 at 22:05
  • ... wouldn't mean they have to appoint every trusted user themselves. All that said, I can't really disagree with your last sentence!
    – Pekka
    Commented Nov 3, 2012 at 22:06
  • 3
    @Pekka The only practical purpose of the reputation system is to measure trust; to dole out privileges based on the only thing that truly matters on a Q&A site: the ability to post quality content. By removing that purpose and giving it instead to a small group people to subjectively decide who gets access, there's no point to having reputation anymore. But I don't believe SE has become too complex to be managed by self-rule and needs an elite group of people to manage it for us.
    – user149432
    Commented Nov 3, 2012 at 22:13
  • @pekka I agree this is a misguided effort. See my comment on Rosinante's answer here which is quite good. Commented Nov 6, 2012 at 21:32
  • @Jeff fair enough. Can you give it a status-declined then? I'LL BE KING OF THE TAG YET!!!!
    – Pekka
    Commented Nov 7, 2012 at 7:54
3

Trusted users are probably, and hopefully, folk who have put in a certain degree of community involvement. Considering that one of the things that @Pekka is proposing for the "trusted user" status is having access to the review queue, could I propose that one of the criteria for this be a reasonable number of helpful flags (say 50), and at least a 70-80% ratio of helpful to non-helpful flags? This means the user takes the effort to flag (and 50 isn't that hard).
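As a sketch, the suggested criterion (50 helpful flags, a 70-80% helpful ratio; 0.75 is used below as a midpoint of that range) amounts to:

```python
def meets_flag_criterion(helpful_flags, unhelpful_flags,
                         min_helpful=50, min_ratio=0.75):
    """Check the answer's suggested bar: at least ~50 helpful flags,
    and a helpful-to-total ratio in the 70-80% range (0.75 here)."""
    total = helpful_flags + unhelpful_flags
    if total == 0:
        return False
    return helpful_flags >= min_helpful and helpful_flags / total >= min_ratio
```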

And while using mods to 'seed' trusted users is a good idea, I think allowing a mod veto/revocation of the trusted user status would be a good balance against it. A large number of negative flags (of unknown amount) by distinct users might be grounds for a review of the status (automating it somewhat).

What I find problematic is the black-box nature of this. While it prevents the system from being gamed, it feels somewhat against the whole open nature of the site. It's something that needs to be balanced pretty well.

2
  • 1
    Much like flags are anonymous, I don't see a problem with this system being anonymous. Because this is a question of trust, the algorithm shouldn't be revealed. Commented Nov 3, 2012 at 13:00
  • 4
    Independently of this suggestion, the "valid flags" requirement could be an immediate remedy for the /review brokenness.
    – Pekka
    Commented Nov 3, 2012 at 13:39
1

I would like to propose an alternate solution. How about this instead of trusted users?

Keep the review system, but allow a user to review a post only if they have earned a silver/gold badge related to the question. For example, to review a C-tagged question, I need to have a C silver badge. That way not everybody can review it; it puts an additional restriction in place.

PS: Do let me know the reason if you choose to downvote.

6
  • 6
    I'm downvoting because: 1) there are some tags where no one has a bronze badge, let alone a silver, and 2) just because I don't have a C silver badge doesn't mean that I can't tell whether a post is complete crap or not. Not knowing C may inhibit my ability to know how good it is or whether it's a duplicate; it doesn't stop me from voting to close a "give me teh codez" question. My downvote indicates disagreement, not that your answer is "bad" in some nebulous way :-). Commented Nov 3, 2012 at 13:40
  • 3
    Welcome to meta! Please don't be put off. It's great fun here sometimes! Commented Nov 3, 2012 at 13:41
  • Plus, I think this would seriously limit who can actually review. I've spent a lot of time on SO, and I know what a good and bad post looks like, and I don't have any silver badges in tags. Not sure this would really solve anything.
    – jmort253
    Commented Nov 3, 2012 at 22:35
  • 1
    @Ben: So tweak the suggestion a bit. If there's less than 5 with such a badge then let anyone review. I like the idea. Commented Nov 3, 2012 at 22:58
  • @Chris, the problem with having to tweak a suggestion is that you'll have to tweak forevermore. What happens if those 5 are no longer active? On holiday? Having a break from the site in question? Do I have to alter the tags on a question to spam flag it? Might other people not do that? Commented Nov 3, 2012 at 23:32
  • Because I don't know the language I'm a lot more careful about every action I take on C posts. I rarely review or edit them and mostly skip them in the close votes queue if I'm in doubt. I spend about 20 times as long on an action if I'm doing something because I want to be certain. I'd rather have someone do that on the tags I'm active and help keep the site clean than them be banned. What is annoying is people without a clue doing harm because they can click a button and get out of there quick. Pekka's suggestion is meant to address this. I do not see how this answer does. Commented Nov 3, 2012 at 23:33
1

I really like the idea behind the suggestion but I would like to propose some minor alterations, which I guess would make the idea a teeny bit better.

  • The trustees should not have a permanent seat like the moderators. This would help us control the damage in case any user goes rogue and undetected. Each user may have a term of 30 days or so, with any significant lapses of judgment costing them their position.

  • Going with your original idea of allocating votes to already trusted users, any user who accumulates a certain number of anti-trust votes cannot be appointed as a trusted user for, say 6 months or so. If someone is not trusted by quite a number of people, that should be a good enough indicator.

  • Given subtle hints that the community team is overwhelmed with monitoring all 300+ community moderators across the network, it might be wise not to grant insta-votes to trusted users. Instead two votes from any trusted user can be considered final, which would make the body of trusted users self-regulatory and a safer solution.

5
  • 3
    This is unworkable... the moderators have enough to do - and their own lives ( I assume :-) ), without having to elect several hundred/thousand users. Commented Nov 3, 2012 at 12:02
  • Right~ Poor moderators have a life too. I overlooked that bit. I'll update my post.
    – jokerdino
    Commented Nov 3, 2012 at 12:04
  • I must say I'm not that worried about circle-jerks - you could say they kind of are the point of the suggestion. I know user X on Meta; I'll trust-vote them, they will trust-vote me. That is intended and would be harmful only if trust-votes would go to harmful users, and I don't see that happening. I don't think all the votes would be distributed only among the Meta posse. What you suggest is a good idea for a smaller site, but I too think it would put too much load on mods on Stack Overflow. Massive numbers of trusted users will be needed.
    – Pekka
    Commented Nov 3, 2012 at 12:26
  • 1
    @Pekka, I just removed that chunk because well, the users are supposedly trusted. Hopefully, they will act with the best intentions in mind if and when this feature request ever gets implemented.
    – jokerdino
    Commented Nov 3, 2012 at 12:30
  • 4
    (And I'll add this to the list of Pekka's feature-requests that will never get implemented.)
    – jokerdino
    Commented Nov 3, 2012 at 12:32
