190

While the timing of this post coincides with some serious concerns we've expressed about not doing a good job of helping and guiding Stack Overflow to remain a welcoming place for everyone, this is something that's been weighing heavily on our minds for quite some time, and it applies to any site that's wired into chat (AKA, all of them).

Sometimes you have problems that stay dormant for months, heck, even years, but when they flare up — it's really ugly. I'm going to make a very firm statement that I'm super proud of 97% of our chat rooms that remain some of the safest places to hang out and 'talk shop' on the Internet; you folks are doing an amazing job of helping us prove that groups of responsible people tend to bring out the very best in one another given loose rules that are often open to interpretation (see linked and related posts, too).

Unfortunately, I need to take a moment and talk about the remaining 3%.¹

Those of you that regularly use chat have probably noticed that each room has a rather distinct culture. In some rooms, a little off-topic 'fun' is not only permitted, but also encouraged, and generally serves to make the culture of the room brighter, and the experience of spending time there more rewarding. In these rooms, troll-like behavior or other things that don't reconcile well with our code of conduct are quickly flagged and removed.

Other rooms prefer to keep the conversation more on-topic, with a focus that's more like a laser than a campfire. Our guidance has always been to essentially go with the flow, as long as that flow doesn't involve things that clearly don't belong on our chat system, or that can't be reconciled with our code of conduct.

And that gets us to the hard part. It's terribly difficult and ineffective to write a list of things you can or can't say in chat.

First, you just invite a lot of rule-lawyering (the Internet version of but I'm not touching you!! I'm not technically touching you!!!) and second, new people see this oddly specific list of things like "Please don't talk about what monkeys really mean by farting" and wonder what kind of crazy people might be lurking behind the door. I could give more real, concrete examples - but let's not go there.

What positively has to function in order for these rooms to exist with our branding behind them is:

  • Stuff that doesn't belong, or that doesn't reconcile with our code of conduct is flagged.
  • The culture of our rooms must be welcoming above anything else to anyone that puts forward a good-faith effort to join and interact.

So, if we see rooms where:

  • Offensive stuff that violates our CoC isn't flagged
  • Offensive stuff that violates our CoC isn't just allowed (however tacitly, through nobody flagging it), it's encouraged
  • People are berated, kicked or otherwise harassed for holding a room's culture to our code of conduct

We're going to shut the room down permanently. And this isn't the first time we've done this.

None of this is new, and as I said earlier, problems sometimes pretend to go away while they secretly find ways to bite you even harder - I put the blame for needing to come here and reiterate all of this yet again squarely on us. But that doesn't absolve folks from the responsibilities that go along with the privilege of using chat.

Chat is a great tool, and we are really proud of the caliber of discourse that flows through our systems every day. We want to keep it available because we're really proud of what most folks do with it.

But we can't have self-policing break down in the face of flagrant violations of our Code of Conduct, and we'll be enforcing that with calm, steely-faced smiles going forward.

Questions? Observations? Anything else? Please leave an answer or a comment. We love leading when it mostly means gently guiding people to do what's good for all of us, and we really don't like it when we need to do it more deliberately. But, we're the custodians of the reputation all of you helped us to build, so we must.


1: Percentages are derived from Tim's brain via the Anecdotal3000 percentage generator implanted by Stack Exchange, Inc.

3
  • Comments archived.
    – Shog9
    Commented May 17, 2018 at 3:06
  • I am curious whether anything has happened on this topic. In their answer, Machavity mentioned things that would be super helpful for room owners to enforce self-moderation. Chat seems to be so low priority that I asked a question about whether there will be more active development. The points mentioned in Machavity's answer all make sense, and due to recent events in a room I own, I am very interested in tools like that. Would be great to get an update on this topic :) Commented Oct 5, 2018 at 8:38
  • 2
    ok, understand there have been outlier rooms and this msg is like a "big crackdown". however, am concerned about a room in particular that has very heavyhanded moderation at times, many mods present, and there have been quite a few suspensions issued from it by various mods, and now feel like mod A in particular is sometimes overreacting, too aggressively moderating, micromanaging content, & issuing suspension(s) for minor to nonexistent offenses. feel sometimes the mods are liberally using suspensions to suppress legitimate dissent over chat policy. zero controversy = bleached
    – vzn
    Commented Apr 2, 2019 at 18:25

19 Answers

309

So, in keeping with This Meta.SO post, I figure I might as well mention the obvious:

Chat moderation tools are terrible

Let me describe a real event that illustrates this. I was in a room where there's banter; people sometimes take friendly jabs at each other, and someone crossed a line. Someone mentioned it, and the poster admitted they'd crossed it too. But the two-minute edit window had passed and the concrete was hard. This left us in the lurch. Normally, messages a Room Owner dislikes just get moved to another room (like Trash or a custom trash chat room), but that doesn't delete them. The only options left were:

  1. Flag it and have the user take the chat ban on the chin for 30 mins
  2. Mod flag it (mods don't primarily moderate chat so this is often slow for a rolling conversation)
  3. Hope the lurking mod in the room was paying attention

I'm a bit baffled by this, because chat is very much like Twitter in that it's a little bit of a "stream of consciousness", but lacks any ability to delete messages without a ban.

There are three things I'd like to see added here if we're going to add more accountability:

  1. Give us the same options for chat that comments have. Right now we have one giant flag for everything, and if you sustain that flag, the message is deleted and the user is banned for 30 mins per flag. Sometimes things just need deletion without a ban. We fixed this for comments, let's fix it here
  2. Allow Room Owners to delete chat in their rooms. If we want internal policing, that must come with expanded powers for people we're going to hold to the fire. Internal policing can and should be the first line of defense here. If a RO sees things going off the rails, all they can do is temporarily kick people. Deleting messages would add more teeth and moderators can still see the deleted messages
  3. Allow Room Owners to issue room bans for up to 60 mins. Can't behave? You can be thrown out without a moderator needing to do anything.
20
  • 35
    You make some good points, don't ruin it with politics. Commented Apr 30, 2018 at 17:43
  • 111
    @BenjaminGruenbaum For better or for worse (you can decide which) the President of the US tells us exactly what he thinks on Twitter. But he can delete his tweets later if he wants. There's no deleting chat. That's my point. No politics here, just a well known example.
    – Machavity
    Commented Apr 30, 2018 at 17:54
  • 13
    I'd much rather be able to effectively delete messages without moving them to another room, that way it's much easier for room owners to see what the deleted message was a response to. (plus it would be nice for room owners to be able to see deleted/edited messages in the transcript much the same way we can see them in the live chat)
    – Kevin B
    Commented Apr 30, 2018 at 18:10
  • 18
    There's also no flag history in chat. I'm not even sure mods can access one. It's all broken. Commented Apr 30, 2018 at 18:27
  • 1
    @AndrasDeak I'm pretty sure mods and CMs can see flag history in chat.
    – TylerH
    Commented Apr 30, 2018 at 18:32
  • 3
    We do have a flag history for chat... it's crappy, but does exist. @AndrasDeak But you shouldn't be afraid to mod flag things... we usually handle them within a few minutes unless it's a really off hour. The bubble that announces them gets really annoying otherwise ;) Remember, the mod flags go to the entire community of mods on chat.se (~500 people) and all of the SO mods on chat.so... so, on SO, that may be somewhat slower but I'm not certain.
    – Catija
    Commented Apr 30, 2018 at 18:55
  • @Catija thanks for the info :) If I need anything flagged I do just that; this is mostly passive musing about the workings of chat (or the lack thereof). To me it's pretty important to know how a mod handled a flag on main. If I messed something up I want to see it get declined. If the user I suspected of foul play was innocent, I want to see the flag get handled with no further consequences. It just feels off to not be able to do that in chat. Commented Apr 30, 2018 at 19:06
  • 2
    @AndrasDeak Yeah, that doesn't exist in chat. We have no way to respond or create a flag history for specific users - partially because we don't have access to who flagged something unless the flag is currently active (in the case of mod flags) or if the spam/offensive flag is currently active and we're actually a mod on the site the room is parented to. Only the CMs have access to all flag history content and who flagged/validated the flags. This is something I'd love to see brought into alignment with how main site flagging works - part of my own answer here.
    – Catija
    Commented Apr 30, 2018 at 19:09
  • 5
    Two things: (1) the custom moderator flag goes out network-wide, so you don't have to worry about it being seen slowly. (In fact, you probably have to worry about the opposite, a quick-draw moderator stepping in from the blue and acting without full context.) (2) I feel like you missed an important option 4, or at least 3b. Instead of hoping the lurking mod notices, why not @ ping them? Or ping a room owner? Either can easily remove a message without prejudice.
    – nitsua60
    Commented Apr 30, 2018 at 23:38
  • 9
    They don't go network wide on SO. There are three chat servers. SO, SE and Meta.SE. Flags are limited to the servers. The 10 K requirements are per server (so I have Meta.SE and SE but not SO) and mod flags are limited to the mods on that server... so only CMs see mod flags on Meta.SE, only SO mods see SO, and all non-SO mods and CMs see SE.
    – Catija
    Commented May 1, 2018 at 1:29
  • 26
    +1 Fantastic answer. There's another reason that chats being permanent makes little sense: comments can be deleted, and this site has spent massive efforts conveying to us that chats are ephemeral, but the moment a mod shoves the comments into chat, they become permanent, and you can't even remove your comments anymore! It's pretty darn ridiculous. Let people delete chat messages!
    – user541686
    Commented May 1, 2018 at 19:28
  • 2
    @Rob Thanks for that. I've gotten mixed answers from y'all so it's nice to be certain. I'll update my mental notes.
    – Catija
    Commented May 2, 2018 at 12:31
  • 2
    @Rob Yeah. That was never a question and I didn't interpret it that way. I know the chat.meta rules. :)
    – Catija
    Commented May 2, 2018 at 12:37
  • 3
    @Machavity Actually, Trump cannot delete his Tweets. They are protected by some law that says presidential communications are history and have to go to the Library of Congress (or something).
    – Almo
    Commented May 7, 2018 at 0:24
  • 1
    @Almo I think that's still up to debate politifact.com/truth-o-meter/article/2017/sep/27/…
    – Braiam
    Commented May 12, 2018 at 17:48
133

This sounds perfectly reasonable and I support this stance. Stack Overflow should not stand for abuse nor should it tolerate it.

Thank you for this.

Questions? Observations? Anything else?

Not once in the five years I've been here has a community manager asked chat users how we're doing, whether or not we're happy, or whether we have the tools we want or need to moderate effectively. At best, we get the tools we ask for after several years.

As an example: I asked for some tools in order to be more welcoming. This was ignored for a year, and then, three years later, a community manager told us there was no problem. (I'll gladly provide more examples if you'd like.)

I am certain that when racism, bad culture or other problems happen you are rarely aware of it.

One anonymous community manager* has been kind enough to show up from time to time to give us advice or tell us when we do things you don't like. Their advice has been helpful.

Let's have an open dialog instead?

If there are cultural issues you'd like to address with a certain room - I recommend we sit down and have an honest discussion about it.

I think Stack Overflow needs to communicate a lot more clearly to its community using the chat service.

Culture is built, not dictated

I'm going out on a limb here to discuss the JavaScript room and the recent unfortunate interaction that resulted in this post.

To be clear - though it appears that the community at large agrees with the JavaScript room's ability to govern its on-topicness - the JavaScript room itself wants to welcome any efforts to improve its culture.

We've had members speak up against things that bother them, and we've always been very open to discussion. During those few days, no one from Stack Exchange talked to us. The only communication we've had from your side is on social media** and Shog's comment on an answer attacking us. (Well, unless you count the comment a community manager left on my answer and later deleted.)

We've always optimized towards "let's not get Stack Overflow staff involved" because we mostly like you humans and we don't think wasting your time on a small subset of the site is worth it. We don't want to alienate you, frustrate you nor make the situation adversarial.

I would very much like to deescalate the situation as much as possible. We want inclusivity; we want to be more welcoming. That is not at odds with allowing "adult" off-topic subjects like drugs, and that's a discussion we're having internally but would happily have externally as well, as explained in my postmortem.


* it's always Shog.

** We decided not to respond on Twitter, so as not to escalate the situation.

7
  • 8
    Anyway, on a more serious note, how do you suggest interacting with the Community Team? They are only a few, and there are hundreds, if not thousands, of rooms. Or do you mean you expect the feature request you linked to will now get higher priority? Commented Apr 30, 2018 at 13:58
  • 4
    @ShadowWizard some rooms do room meetings - some rooms have culture repos, we have mods interact with those, popping into chat every once in a while and saying "how's everyone doing? Are you all happy with this?" would go a long way making things more welcoming. Commented Apr 30, 2018 at 14:05
  • 48
    Culture is built, not dictated Might be the most prudent thing I've read all week!
    – Möoz
    Commented May 1, 2018 at 0:35
  • 28
    We had an open dialog during a town hall event almost 2 years ago, and absolutely nothing changed. And there was plenty of discussion before and after. This has been a long time coming: SE (apparently) does not have the resources or incentive to build a better chat system, nor the desire to "force" moderators to constantly enforce chats, which are really a separate entity from the Stacks that host them.
    – user287266
    Commented May 1, 2018 at 0:37
  • 3
    @CreationEdge interesting, I had no idea about it - no one came to chat and told us and we don't lurk meta.se often (mostly meta.so) Commented May 1, 2018 at 7:09
  • An anonymous community manager, whose name I shall not utter here.
    – vgru
    Commented May 3, 2018 at 7:52
  • So now we need the minds behind SO to fix chat ;) Commented May 6, 2018 at 2:45
61

I'm appreciative of this move. I've seen at least one room go beyond acceptable behavior and be closed down and, at the same time, I've heard of rooms that regularly have very constructive discussions about subjects that would normally lead to very problematic behavior.

While I'm not in chat at all times, I tend to agree that when I see flags, they tend to involve the same rooms or users - some of the latter have dozens of short (30-60 minute) chat suspensions. This seems a strong indicator that at least some of the problem lies in our (the moderators') response, and our possible unwillingness to give these users the longer suspensions they deserve.

How difficult would it be to automatically increase the suspension length based on all suspensions over some window n, where n is measured in days or longer rather than the hours-scale window used right now?

This means that repeated offenses in a week/month/year would be automatically recognized and the suspension would more directly correlate with continued poor behavior without the intervention of a moderator (in the case of 10k flag message deletion). It's worth noting that removing a chat suspension is pretty simple in cases of poor flag handling.
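To make the idea concrete, here's a minimal sketch of what such an escalation rule might look like. All parameters here are assumptions for illustration only (a 30-day lookback window and doubling per recent offense); nothing in it reflects how Stack Exchange's chat actually works.

```python
from datetime import datetime, timedelta

# Illustrative parameters -- not anything Stack Exchange actually uses.
BASE_MINUTES = 30          # current default flag-triggered suspension
WINDOW = timedelta(days=30)  # look back over days/weeks, not just hours

def next_suspension_minutes(prior_suspensions, now=None):
    """Scale the next suspension by how many prior suspensions fall
    inside the recent window, so repeat offenses in a week/month are
    recognized without a moderator stepping in."""
    now = now or datetime.utcnow()
    recent = [t for t in prior_suspensions if now - t <= WINDOW]
    # Double the base length for each recent offense: 30, 60, 120, ...
    return BASE_MINUTES * (2 ** len(recent))
```

Under this sketch, a user with two suspensions in the past month would get 120 minutes next time, while a clean month resets them back to the 30-minute baseline, which is the "they start dropping off" behavior described in the comments.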

These users shouldn't be allowed to continue chatting, but the moderators often miss these events entirely, as the only historical record is on the user's chat profile or on an admin page that many mods are likely unaware of and that wasn't really intended for general use.

I really don't like putting more work on your plates but I think these things would help:

  • improved auto-suspension escalation for repeat offenses
  • improved moderation tools/chat flag history
  • improved chat FAQ page (it's sorely out of date - I'd bet some mods would be happy to help draft it :cough:)
  • better guidance for moderators on how to respond to repeat offenses from the same person/group of people

Thanks, again!

13
  • 11
    Auto-suspension escalation simply based on the size of the suspension history inevitably puts more weight on flag-bans, which I am afraid is a mixed bag of genuine flags, troll-flags, flags which are "invalid" (eg, a certain user thought this or that message might be rude or abusive to this and that user, but in reality it wasn't intended to be - a confusion which could be easily resolved by discourse prior to flagging), and not to mention mishandling of flags by sloppy 10k+ers. Commented May 1, 2018 at 9:30
  • 4
    Given that, I don't think that's necessarily a good idea. In general there's no reason to expect the suspension log in toto has any correlation with repetitive offensiveness (as in, that's a completely statistical claim and not obviously true). There should be an auto-suspension escalation based on the subset of the suspension log that's invoked by some moderator, however. That's surely a better measure of a pattern of offensiveness throughout time than the whole chaos of the suspension log. Commented May 1, 2018 at 9:34
  • 4
    It's not in total. It's validated only in n time with a way for mods to clear the flag history for a specific event. If someone cleans up their act or if the flags are invalid, they start dropping off. I don't really think what you're saying is particularly common or difficult to control for. Most users with a flag history are not trustworthy chat users.
    – Catija
    Commented May 1, 2018 at 11:45
  • 4
    Couple that with flag accountability, so that you can see who's casting those troll flags and you can turn the tables on the people trolling chat flags without asking for CM help.
    – Catija
    Commented May 1, 2018 at 11:49
  • Ah I see, so you get to control the time-endpoints of the flag history. Fair enough, that might be something. I don't agree with the conclusion that "most users with a flag history are not trustworthy chat users"; I'd think even if that's true the percentage of reliable chat users (whatever measure of reliability you'd choose!) with a nontrivial flag history varies room-to-room: that's exactly where the statistical effect of room culture kicks in, a concept which the moderators don't like very much, I guess. Commented May 1, 2018 at 12:08
  • 2
    The CMs/balpha would control it - by designing it and nudging the parameters if necessary. For example (very rough): if you have one validated s/o flag in a day, nothing changes (30-minute ban). A second one would increase it (say, 3 hours). A third in a day would hit harder (24 hours). From there, it scales up. Get three in a day twice in a week and you're out for a week instead of a day... but if you don't, you start fresh... ish. Anyway, this is designed to catch really troublesome users who regularly fail to be nice. This is pretty uncommon.
    – Catija
    Commented May 1, 2018 at 12:17
  • 4
    I do understand your concerns because I've seen the flags in the room you use primarily... I assure you, for the chat.se server, that room is an extreme outlier. I'm sure you've had problems with trolling flags which is why fixing the mod tools for chat is part of this answer.
    – Catija
    Commented May 1, 2018 at 12:21
  • 1
    Thanks for elaborating on the idea of the mechanism! That seems well thought-out, so I redact my skepticism for the time being. Also, interesting. I suspected my room is an outlier but your use of adjectives seem to suggest it's far from a good predictor of how chat.SE on the whole behaves. Thanks for clarifying! Commented May 1, 2018 at 14:11
  • I take it you're also talking about the cumulative aspect being chat server wide? I didn't see mention of it (though I could have missed it). The reason I bring this up is some people cannot just leave well alone. A "known" user could bring their nastiness from one chat to a different chat after being banned. Not sure how it works currently, but would want to ensure it covers all the bases. Commented May 1, 2018 at 17:46
  • 3
    @Pᴀᴜʟsᴛᴇʀ2 Yes. If you're going to drag your nastiness around with you, much like your suspension is server-wide, so too are the risks of behaving badly.
    – Catija
    Commented May 1, 2018 at 17:48
  • 4
    re "problematic users". suspect there are some basic statistics about some users chatting way more, but also getting more flags. so would also like to look at the frequency of flags ie flags per total lines in chat. think there needs to be some positive acknowledgement/ attn to "regulars" who sustain/ anchor rooms and thereby positively impact/ drive indirect engagement to SE in general, and yet gain no rep for it instead of just negative attn to occasional flags.
    – vzn
    Commented May 1, 2018 at 20:45
  • 3
    @vzn Sure. That's reasonable. That said, I don't think there's a huge correlation between chatting more and being flagged more. There's some data in chat... and Mods can see the users' suspension histories but more of the most active users of all time in chat have fewer than 10 all time chat annotations/suspensions than otherwise. chat.stackexchange.com/users ... but some of them have really problematic histories, too. :) We can't see to the granularity of flags, though, only suspensions, which would be nice to change.
    – Catija
    Commented May 1, 2018 at 20:55
  • You should draft that FAQ yourself; that's basically what we're doing here. Right now is the time to propose your version of it, gather support, and with a higher degree of probability than ever get it implemented. I wouldn't wait for another chatpocalypse if you can do it now. Commented May 3, 2018 at 22:26
44

Well - I guess it's about time. I don't really have much newer info than I did the last time we talked about this three years ago.

The chat flag system is still a little odd; there's no real "out of band" way to keep track of problem sites and users.

One thing I think is an issue is how there's no real "framework" for what suspensions should be. Contrast the mod message system on main sites with how suspensions work on chat - with arbitrary numbers of hours settable.

In a sense, outside the rooms we're regulars in we don't really have context. A random moderator might not know a certain user has a tendency to kinda be on the borderline or even offend regularly. A mod who is a regular would throw the book at him. A non-regular one would need to go through the user's suspension record and decide.

And while it's not quite how things have worked lately - we kind of need a backstop. We do realise that our CMs are busy, but sometimes folks may need help from someone with a little more authority to sort things out. Effective chat moderation has to be instant, or close to it, and it would be nice to have someone around to ask folks to knock it off, or even just hang out.

If there's one thing I've learnt being a RO and mod on root access - effective moderation starts with the community. For the most part we know what we need to do. Bad chat moderation usually ends in fire.

Also, having been apprised of the situation - I do realise that it's a potential PR issue - but if a CM's going to talk about a chatroom on Twitter, could we have someone pop by and talk to the room, please? (If it has happened, well, awesome, but it should be a policy. "I've talked to the folks involved, and ______" would be such a nice thing to see.) Communication and setting expectations is very important here.

14
  • 5
    I do agree that communication with local admins (mods/ROs) is preferable, but in the context of that tweet and the linked chat conversation, I think the room acted really badly. If someone (even or especially a newcomer!) is offended by a topic of conversation, people should drop it and change topic rather than arguing and accusing that person of trolling. I was especially disappointed to see this response by a moderator. Commented Apr 30, 2018 at 17:06
  • 10
    @Randal'Thor "If someone (even or especially a newcomer!) is offended by a topic of conversation, people should drop it and change topic". I'm not so sure that this is so obvious. It might be true on SO chat because we still have Be Nice and the parent site is purely professional, but in a generic community standpoint if you enter a room full of people who are all riding unicycles and you hate unicycles, you should rather leave. Now, this is an awful strawman; I just wanted to note that the fact that someone perceived something as inappropriate shouldn't automatically imply a shut-down. Commented Apr 30, 2018 at 18:32
  • 7
    @Andras True, that statement was too absolute. As always, there's a line to be drawn between "X is offended by Y spouting racist memes" and "X is offended by Y using the word 'tomato' in conversation". A hard-and-fast rule that claiming to be offended is enough to get a topic change will never fly, because it could be abused by trolls. But don't you think prostitutes (especially on a programming site) falls on the "could reasonably be offending" side of the line? Commented Apr 30, 2018 at 18:37
  • @Randal'Thor yeah, I wasn't objecting to this specific instance, only the absolute :) I agree with your line-to-be-but-never-will-be drawn. Commented Apr 30, 2018 at 18:40
  • 12
    I think it's also rather important in my opinion to distinguish between "This offends me" and "this might offend someone". The latter can be said about just about anything. The user didn't enter the javascript chat claiming that the discussion offends him, he said it was unwelcoming.
    – Kevin B
    Commented Apr 30, 2018 at 18:40
  • 2
    @Randal'Thor I'd like to point out that the discussion was stopped pretty immediately - what ensued was discussion about the discussion itself. The offending discussion, and what that mod is talking about, is whether or not discussion of sex is allowed - which can technically constitute a discussion of sex, but there is a big difference in my opinion. I agree it wasn't perfect, and there is some unfortunate historical context - the room was recently trolled (quite a lot) by people pretending to be newcomers but who were actually the same repeat troll. Commented Apr 30, 2018 at 18:43
  • Also, it's really important for me to note I am neither agreeing or disagreeing with your criticism in the comment above - I am just hoping to provide some context around it. Commented Apr 30, 2018 at 18:45
  • 1
    A good way to look at this is, would a hypothetical reasonable person be likely to be uncomfortable or be offended by encountering the subject of conversation and/or the way the subject is being expressed, in that setting? I have heard many people express the idea that they no longer visit Room X because the conversation there frequently makes them uncomfortable; that's a shame. People shouldn't have to know about a site's rules or policies or Github to understand why conversation a reasonable person would find out of place is in fact "ok".
    – Rubio
    Commented Apr 30, 2018 at 20:51
  • 4
    Someone, inartfully perhaps, expressed their nonetheless completely appropriate opinion that the conversation they walked in on made either them, a reasonable person, or both, uncomfortable— particularly given the current climate on SO, which they referenced by linking to the blog post. Sure, that topic ended, but it wasn't ended with "Oops, you're right"; it was instead replaced with what amounted to "Who are you to tell us what we can talk about here? You troll! Get out!". An appropriate calling-out, even expressed inartfully, is no excuse for inappropriate defensiveness or abrasiveness.
    – Rubio
    Commented Apr 30, 2018 at 20:56
  • 2
    Regarding the tweet: No, the relevant room wasn't contacted prior to, or after that tweet.
    – Cerbrus
    Commented May 1, 2018 at 6:57
  • 22
    It's kinda amusing that SO has spent years essentially ignoring chat and ignoring people imploring them to improve the tools to make moderation easier then suddenly goes all out nuclear. That's what happens when you neglect a feature. Don't do that.
    – Rob Moir
    Commented May 1, 2018 at 9:13
  • 2
    @BenjaminGruenbaum Did you happen to read far enough down the transcript to see what they said about the guy after they kicked him from the chat? Helpful link "Chat was mean to me" "inb4 come back in and calls room jerks" "I think he's gonna use the term bigots" "I can hear him heavily breathing and angrily typing it right now" "He just said 'thanks for that folks' [and was kicked again]." So welcoming. Commented May 2, 2018 at 21:55
  • @Draco18s I've read that part, and my take on it is that some troll came in, since it's a new account, so they treated them like a troll and made fun of their alleged trolling efforts. Making fun of others is not a noble thing to do in any scenario, but it's an especially risky thing to do if there's a large chance it wasn't a troll, just a slightly misguided user. I know it's pretty common for people in very high-throughput rooms to become "toxic" like that; basically 90% of the time something like that happens, it was a troll. But that 10%, though. So yes, we have to fight this "toxic by default" mood. Commented May 3, 2018 at 14:39
  • 1
    @user1306322 You're right that it could have been about an actual troll, but it wasn't. And even if it was...you're right, its not appropriate even if it had been an actual troll. Commented May 3, 2018 at 16:16
40

Chat self-moderation starts with the individual user. We've been talking about ways to make moderation easier (and that's desperately needed!), about ways to empower room owners more, about automatic suspensions with teeth... but the individual user, the author of a chat message, lacks one important ability: after two minutes, whatever you said is permanent without moderator intervention.

We need to allow users to say "that thing I said was pretty stupid" (or didn't come out right, or was the scotch talking, or whatever), and let them clean it up themselves. We allow users to delete their own comments at any time, even if it might make other comments obsolete or puzzling; if they can do it with comments, why not in chat?

In comments Shog raised concerns about abuse; apparently there have been cases where people have done creepy things in chat and then tried to cover up the evidence. I think we can find a workable place between "locked in at 2 minutes" and "can delete everything and hide the bodies". Perhaps (just as a starting proposal to be refined), we limit people to five chat deletions per day and only allow deletion of messages within the last day. This allows the user who realizes (but not immediately) that he messed up to fix it, without opening the door to widespread deletions.

We could also raise a moderator flag if somebody deletes some threshold number (or percentage) of recent chat messages, like the flag we get for a sufficient volume of comment deletions. If somebody deletes the allowed five messages a day, every day, we probably want to notice that. This auto-flag isn't so that the deletions can be reversed; neither chat deletions nor self-deleted comments can be reversed by moderators. It's to let moderators know that there might be a concerning pattern to look into.

We could also log post-two-minute deletions in the chat user's history, alongside the flags, to make it easier for mods to see what users are deleting hours after the initial post. This would make it easier to review attempted coverups of creepy behavior.
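
The quota-and-flag proposal above can be sketched in code. This is purely an illustration of the idea as a starting point, not any actual Stack Exchange implementation; the limits, names, and the `DeletionPolicy` class are all hypothetical:

```python
from datetime import datetime, timedelta

# Illustrative parameters (not actual Stack Exchange values)
DAILY_DELETE_QUOTA = 5                 # self-deletions allowed per rolling day
DELETABLE_WINDOW = timedelta(days=1)   # only messages newer than this can be self-deleted
FLAG_THRESHOLD = 5                     # deletions in a day that raise an informational mod flag

class DeletionPolicy:
    def __init__(self):
        self.deletions = {}  # user_id -> list of deletion timestamps

    def _recent(self, user_id, now):
        # Deletions within the last rolling day
        cutoff = now - timedelta(days=1)
        return [t for t in self.deletions.get(user_id, []) if t > cutoff]

    def try_delete(self, user_id, message_posted_at, now=None):
        """Return (allowed, raise_flag) for a self-deletion attempt."""
        now = now or datetime.utcnow()
        if now - message_posted_at > DELETABLE_WINDOW:
            return False, False   # too old: moderator intervention required
        recent = self._recent(user_id, now)
        if len(recent) >= DAILY_DELETE_QUOTA:
            return False, False   # quota exhausted for today
        recent.append(now)
        self.deletions[user_id] = recent
        # Auto-flag when a user burns the whole quota in one day
        return True, len(recent) >= FLAG_THRESHOLD
    ```

A call like `allowed, flag = policy.try_delete(user_id, posted_at)` either permits the deletion or refuses it, and the flag is purely informational for moderators rather than a reversal mechanism, matching the intent described above.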

21
  • 12
    if we wanted to get really fancy... Add some sort of "tap on the shoulder" feature that'd give the author a heads-up that they'd just made a blunder and gave 'em the opportunity to delete it. Escalates to a flag if not addressed in [time].
    – Shog9
    Commented May 3, 2018 at 2:46
  • 4
    The problem with unlimited deletion is rage-quits and straight-up hiding of abuse (think: those creeps who set up chat rooms for solicitation). There's no "undelete" for chat messages; that'd have to be added, or a rate-limit for deletion so severe as to make the feature of limited use.
    – Shog9
    Commented May 3, 2018 at 2:50
  • 6
    Are chat abuses more common than comment abuses? (We can't undelete those comments, either.) I wouldn't have a problem with a limit per day on deletions; if somebody is doing this more than a handful of times, he probably needs to rethink his chat use anyway. Yes, that would mean more tooling. What can we do that's better than "permanent after two minutes" without enabling the abuse you're worried about? I like your "tap on the shoulder" idea; it's kind of like the suggestion to show flags to the author first. (I know that's been suggested for comments; don't know about chat.) Commented May 3, 2018 at 2:55
  • Comment threads tend to be a lot shorter than even brief chat conversations. And... there's a flag that gets raised if you delete too many comments. Neither is common, but chat stands to be potentially more disruptive. As for abuse, that unfortunately has been common in the past, complete with using deletion to try & hide it - of course, they couldn't hide stuff older than 2 minutes, so it got caught anyway.
    – Shog9
    Commented May 3, 2018 at 2:59
  • 2
    ...now that I think about it, chat was waaaay creepier just a few years ago. We're not exactly doing great right now, but at least some of the worst stuff is no longer commonplace.
    – Shog9
    Commented May 3, 2018 at 3:00
  • 1
    I did suggest raising an auto-flag if there are too many deletions (as an extra). Maybe that's not as optional as I thought and needs to go with the ability to delete. Or maybe we just need to lengthen the window from 2 minutes to, say, a day (you come back after sleeping on it and realize how inappropriate you were). Maybe log deletions with the chat annotations, for easier auditing? Commented May 3, 2018 at 3:01
  • 5
    I think we could probably swing a day window combined with a flag that looks specifically for deletion >= new not-deleted messages over a longer time period (two days to a week). So if you flip out, come back sober & clean up, fine. But if you delete almost everything you post on a regular basis... Bad news! But yeah - the UI for reviewing these would need to be a lot better than what we have now.
    – Shog9
    Commented May 3, 2018 at 3:06
  • @Shog9 if you decide to implement this, then please do a multi-tiered visibility escalation: first the room owners see warnings about potentially problematic deletions (room owners can decide it's not a problem, so as not to bother mods; or send it straight to them without delay; or even temporarily suspend the user's deletion ability and raise a flag, like with kicks), then the moderators of the site the room is tied to, and then, if it's not dealt with by anyone in the previous steps, escalate it to global mods. And probably don't show it to all 10k users, who obviously can't do anything about it. Commented May 3, 2018 at 14:14
  • Y'know that doesn't really work for the case of a RO who creates a room to harass others, @user13
    – Shog9
    Commented May 3, 2018 at 14:40
  • I'm not sure what exactly you speak of there, but the code could check whether the deletions were made by a room owner; in that case the stage of showing the deletion warnings to room owners would be skipped, going straight to local mods, then global. Commented May 3, 2018 at 14:43
  • 1
    I'm curious how many user-deleted comments cause a flag @Shog9 because I've never seen that and I found a user doing that frequently without our notice... so either another mod cleared it without saying anything or...?
    – Catija
    Commented May 3, 2018 at 17:48
  • You can limit the number per day, too. If someone can delete messages but only, say, five per day, that makes killing entire histories very tiresome... but allows the occasional "whoops" message. Using all of them in a day could trigger an alert "we see you're deleting a lot of messages, please flag for moderator assistance if you need help".
    – Catija
    Commented May 3, 2018 at 17:52
  • 1
    The threshold for the flag is pretty high, @Catija; it's a very rare flag even on SO. This touches on why the same technique is bad for chat: the use-case is almost the reverse of for comments. On main, there are only a few scenarios where you don't want folks deleting their comments; comments are never really an end-goal. In chat, messages ARE the goal - so there are really only a few scenarios where you want them removed. Making message deletion easy opens the door to far more unwanted uses than it does wanted ones... And I don't think anyone wants a massive queue of mod-flags in chat.
    – Shog9
    Commented May 3, 2018 at 21:45
  • 2
    @Shog9 But we have gotten the occasional "can you delete this" mod flag before and they're usually handled as requested. They're not my favorites but it's at least an option. My main concern about this is that (while rare) some of these users with a habit of deleting comments aren't particularly nice in those comments (and I'm not talking about my own past experience). And I've seen users on chat intentionally deleting their chat messages within 2 minutes to make it difficult to follow their discussion, which was also not great. I don't want entire chat rooms like that.
    – Catija
    Commented May 3, 2018 at 21:49
  • 1
    Even there, it's not like we don't want non-nice comments deleted - we just want to know that someone's being persistently rude in addition to the comments going away, @Catija. In chat, pure offensiveness - while a problem - pales next to the damage that both predatory behavior and vandalism can cause, and both scenarios are exacerbated by the ability to delete messages long after they were posted.
    – Shog9
    Commented May 3, 2018 at 21:54
29

Please tell us more directly what you want from chat behavior. Communication through concrete examples would make things so much simpler. Using examples doesn't mean giving an exhaustive list, it means giving people something to extrapolate from.

If a room is going in a bad direction, come into the room and tell people that there is an issue and explain what the issue is.

I would like to see concrete examples of behavior that have crossed the line. I understand what the rough intention behind the "be nice" policy is, but I don't know where the actual line is. Real examples would help a lot when moderating chat. If you want to keep the bad examples to moderator eyes only, that's fine.

9
  • 1
    While I can't speak for others, there are concrete instances of room removal for reasons described more clearly here. From what I gather (not having witnessed the behaviour in the room), the main reason the room was deleted is not multiple users engaging in disruptive behaviour, but multiple users not listening to others asking them to stop said behaviour. Commented May 1, 2018 at 12:00
  • 7
    @Discretelizard Pointing at a whole room is not very specific. Excerpts from the transcripts clearly pointing out specific symptoms of the problems would be great. Commented May 1, 2018 at 12:08
  • While those discussions don't have transcripts, the meta posts do describe what led to the room removal. I suppose you would have to take Shog's word for it as to what really happened, but what would a transcript really add? Is it not clear from the descriptions given what should and what shouldn't have been done to avoid room removal? Commented May 1, 2018 at 12:11
  • 2
    @Discretelizard a transcript would let each user read it, learn and decide on their own. We would have concrete examples of specific messages to study and talk about. I don't understand how that's not obvious to everyone. Commented May 3, 2018 at 17:32
  • 2
    @user1306322 Well, I'd rather have a summary by a mod that explains the systemic problem than sift through transcripts to come to the same conclusion. While I do understand that you'd like to understand this example precisely, I think the value of the actual content, with respect to the lesson to learn from it, is minimal. But it seems we disagree here. Commented May 3, 2018 at 19:25
  • 1
    @Discretelizard I am not against a summary by any moderators who wish to share their point of view, quite the contrary. I like it when we have all the good data to review published and open. I don't like it when we're basically told what to think without any concrete data to draw our own conclusions from. Don't you agree with that idea? Commented May 3, 2018 at 22:07
  • 1
    @user1306322 In general, being able to test the assertions of others is good. However, I think that the reasons the room was deleted, and not merely closed, weigh more strongly than letting people who were not involved in the particular chatroom test whether the actions actually occurred, when we have the meta post explaining the important points. In other words, I don't think the exact transcript is any of our business. You may disagree with this, but this is why I think the current situation is fine. Commented May 4, 2018 at 9:23
  • Maybe it is fine now, but it could have been resolved earlier if the chat moderation tools were improved sooner. Now we don't have the option of referring to that incident and giving specific examples on how that situation could have been handled with better tools. So we don't have useful information to base our new propositions off of. It's certainly a loss, but I hope we can manage to push the improvements without that. As I said, there are many sides to withholding seemingly "useless to the public" information. Commented May 4, 2018 at 12:11
  • 3
    @user1306322 In that specific instance, chat moderation tools were practically unused. Bans were extremely rare, usually short, with longer bans often lifted by more lenient moderators, and the room was frozen only once or twice ever. Not only were the chat moderation tools poor, but there was a fundamental lack of agreement on how to use them or whether to use them, with room owners, local moderators, and off-stack moderators often having very different opinions on how to handle any given situation. It wouldn't have mattered what tools were available, because at its core it's a people problem.
    – user287266
    Commented May 5, 2018 at 8:32
26

I've only ever spent very little time in chat, so maybe I'm missing some important context, but I feel strongly enough about this that I think it deserves to be an answer, though it has been mentioned in the comments.

I'm distinctly getting the impression that this will be a one-strike, no-warnings policy. This seems fundamentally unfair. Certainly a chat room "going bad" is a gradual process, and there ought to be several red flags along that process where it would be natural to give mods a warning of "get your things together, or we're going to have to delete you". In this way, a room which might be beginning to display bad tendencies could be guided back towards good behavior, and the necessity to annihilate could be avoided. Of course, rooms which refuse to comply will have to receive their due consequences.

Other people have mentioned strengthening chat room moderation tools. Again, I really don't have enough experience in chat to attest to any lack thereof, but enabling moderators to do their job better would certainly go hand in hand with encouraging moderators to do their job better, rather than preemptively pulling the plug with little to no warning.

Aside from this, I'm definitely in favor of deleting chat rooms that refuse to comply with the Code of Conduct; I just feel we need to take care to strike the right balance and not be needlessly punitive where the necessary action might not be so drastic.

2
  • 2
    Hm, I would note that the criteria for shutting a room down aren't sudden or single-strike. For example, in order to conclude that people permit or encourage offensive things, it takes time to see multiple examples of it and conclude that it's a pattern. It's definitely totally normal for there to be warnings, too, I've seen plenty of that - and if anything this announcement should make mods more likely to warn folks before it gets too bad.
    – Cascabel
    Commented May 1, 2018 at 21:49
  • 3
  • The thing to know about the lack of moderation tools in chat is that only moderators can delete other users' messages. Room owners can only move messages to other rooms (usually "trash" or similar), but someone may still go there, find a message, and flag it, and the user would get suspended, which is dumb. Basically, we have to have moderators online at all times for complete self-moderation to work, which is not realistic. So we'll have to ask the chat developers to give room owners the required tools, since they're in chat most of the time, unlike mods. Commented May 3, 2018 at 0:34
14

I appreciate this move toward being aggressive with chat users who regularly break the "Be Nice" policy and disrupt constructive discussion in chat rooms. I also support action against chat rooms that attract flags and controversies on a regular basis. This makes chat a useful place alongside the main and meta sites. (Some users have said in past answers that chat is not very useful.)

The post says that rooms where the Code of Conduct is violated will be permanently shut down.

Now, what is offensive in chat is not as clear as it is on the sites. It depends purely on the flagger's luck and on which moderator the flag happens to reach.

There is no clear, uniform guidance for moderators.

  • When to take the policy literally?

One of the points in the Code of conduct is:

Name-calling. Focus on the post, not the person. That includes terms that feel personal even when they're applied to posts (like "lazy", "ignorant", or "whiny").

Name-calling. The policy specifically blacklists the word "ignorant", which may feel personal. Of course, this should not be taken literally in light-hearted, humorous conversations. I'm talking about pretty serious conversations where one of the users leaves without a word. I have read conversations like that where the word was used repeatedly. I asked the moderator, and they replied that the word should not be taken literally but in the context of the conversation and topic. They didn't remove those messages, saying the author wasn't referring to the other person, that it is fine to say "ignorant" within the context of that specific site, and that the site-specific rule applies more than the general Stack Exchange policy. So I had nothing more to say. Left silently.

  • When to take the flags with respect to context?

Bigotry of any kind. Language likely to offend or alienate individuals or groups based on race, gender, sexual orientation, religion, etc. will not be tolerated. At all. (Those are just a few examples; when in doubt, just don't.)

Before the recent Stack Overflow blog post, I saw another conversation containing personal comments about a user. The individual chat messages are not literally rude, but the conversation is definitely problematic when read in full. I flagged one such message without a second thought. The flag didn't survive a moment. I didn't flag any more messages because I knew the result: those messages survived. I had nothing to say to the user who typed them except "Be Nice". (Where in chat is the aggression, except in your words and my thoughts? Those messages received stars instead of flags :/) Left silently.

Some Room Owners do not care about moderation

Some Room Owners ignore such conversations even when they are brought to their attention. Moderators don't visit chat often to check the history and conversations. Some messages look controversial, but "room culture" is sometimes given as the reason they're acceptable. I agree room culture should be taken into consideration when taking action, but how many times? The behavior is sometimes repetitive, yet the action taken is minimal: messages get deleted or moved, and the users continue the same behavior again. Some users create rooms only to talk about topics separate from the site's, and they don't care about moderation. This is another reason many messages are never flagged and brought to a moderator's notice.

I believe there will be many instances like this where Room Owners and moderators are not actively participating in chat. These conversations happen in the absence of moderators, chat flags are not effective all the time, and Room Owners have limited tools to act against such conversations. So, to be honest, self-moderation is not happening in many chat rooms where mods are not around; it only works in active rooms where there are always a couple of ROs and a moderator checking what is happening in the room. I also believe these incidents happen in some of that 97% of rooms. They may not occur regularly, but they occur more than once or twice every two months. If the team has made a decision about the 3% of rooms, they should also look at the unnoticed 1%1 within the remaining 97%.

Lack of awareness of responsibility of a room owner

Some users are not aware of the responsibilities of a Room Owner. They do not know that they should also moderate the chat room in the absence of a moderator, and when there are problematic messages in the room. This is another reason messages are not being flagged.


So, here are my requests:

  • Uniform and accurate guidelines for moderators on when to take action on users.

  • More tools for ROs to take action on problematic users. Removal of Room Owners if they have not visited the chat for a long time. Just as the system selects a new owner based on activity, there should be some process to remove them too.

  • Community Managers and moderators should check flags and conversations even in rooms with less activity.

  • Update the chat FAQ. The ancient version of mobile chat was updated to the new version, but the FAQ remains old and outdated and needs to be refreshed.


1. Percentages derived from Nog's brain from his experience on chat.stackexchange.com

4
  • 2
    Also, no automatically-assigned ROs. RO should be opt-in, and if a room can't muster anybody willing to do the job, that's a situation that requires a closer look. Commented May 2, 2018 at 15:54
  • Speaking about flags doing nothing: I know that I've occasionally been in a chat room, had the site say "hey, review this flag", and gotten an out-of-context message from some other chat. I look at it, go "I have no idea wtf this is", and click the ignore/skip button. That is, I see nothing in the message that seems to violate any sort of "be nice" rule, but I don't know where it's coming from either. Maybe there is something about it that's offensive; I just don't know. Commented May 2, 2018 at 21:41
  • 1
    @Draco18s sometimes flags are abused for advertising something to all the other 10k users in chat. The intention was not to attract attention because something is inappropriate, but just to attract attention. It's a misuse of the flag feature and we don't have any tools for combating it. Commented May 3, 2018 at 17:34
  • @user1306322 The ones I've seen didn't look like advertising either, just a random chat message. Wish I had an example to give, but I don't remember the content at all. Commented May 3, 2018 at 20:39
11

The concepts of "normal user", "room owner", and "moderator" need to be expanded into more granular abilities (imagine file-system per-user rights; it's a very similar concept):

  • Room owners who can change other room owners' and users' abilities (so you can have some users with limited extra abilities and who can't change other users' abilities)
  • Delete own message older than 2 minutes (up to 48 hours/older) — (automatically granted if you have > 10k rep site-wide, like being able to see chat flags, but may be explicitly allowed by a room owner)
  • Delete any user's recent message in the room (up to 48 hours of age) — there's a case to be made about not so frequently visited rooms, but we'd have to collect more data from such rooms' regulars to figure out their needs
  • Delete any user's message in the room

Deleting messages should be possible in ranges, same as with moving them.

Flags should only propagate outside the room or site after 3-5 minutes if no action is taken by users and if there are no online moderators who can see it in mentions and come into the room to figure it out.
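
To illustrate, the granular abilities above could be modeled much like file-system permission bits. This is only a sketch: the `ChatAbility` names are hypothetical labels for the list above, and nothing here reflects how chat is actually implemented:

```python
from enum import Flag, auto

class ChatAbility(Flag):
    """Hypothetical per-user abilities, in the spirit of file-system rights."""
    NONE = 0
    DELETE_OWN_OLD = auto()     # delete own messages older than 2 minutes
    DELETE_ANY_RECENT = auto()  # delete others' messages up to ~48h old
    DELETE_ANY = auto()         # delete any message in the room
    GRANT_ABILITIES = auto()    # change other users' abilities

def can(user_abilities, required):
    # A user may act if every required bit is present in their ability set
    return required in user_abilities

# A room owner with full rights; a trusted regular with limited extras
room_owner = ChatAbility.DELETE_ANY | ChatAbility.GRANT_ABILITIES
trusted_user = ChatAbility.DELETE_OWN_OLD | ChatAbility.DELETE_ANY_RECENT
```

A room owner holding GRANT_ABILITIES could then toggle individual bits for other users instead of promoting them to full owners, which is the core of the proposal.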

2
  • 1
    The question that should be asked here is how to promote self-moderation, but it doesn't matter if we don't have the tools to do it (fair warning though, the permanency of chat is one of the reasons I second-guess before posting a message). I never wanted to be an RO, however there were times, in a galaxy far away, when I could've used the tiniest bit of authority, to the benefit of myself and others. +1
    – Mazura
    Commented May 3, 2018 at 5:20
  • The specifics of my favorite room closing are still unclear to me but the FFS, Flags! Party shenanigans didn't help one bit. I've heard this OP's sort of PSA from mom and dad before... Don't make me come up there... - Then don't have a baby monitor in our play room. At least put it on airplane mode. - No more flag nonsense airing dirty laundry all over the site. +2
    – Mazura
    Commented May 3, 2018 at 5:35
9

I rarely use chat and I have no opinion on how it functions, but I do wonder about the language used here. Is it really necessary to enforce self-moderation aggressively? Could one not enforce it carefully or diligently or even cheerily?

I understand the word 'aggressively' is meant to convey the seriousness of SE's intent, but must that really be brought so ... aggressively? It might be a cultural thing: in American shows nowadays there's a tendency towards aggressive language, even in comedies people are all the time killing it or crushing or destroying someone. It gets the laughs.

But SE is not going for easy laughs here, just making a statement and giving it some extra charge by referencing violence. I notice it and it bothers me. "We're more actively enforcing self-moderation in chat" gets the message across just as well.

2
  • 1
    As a Room Owner of a pretty active chatroom on Stack Overflow, let me say that carefully, diligently, and cheerily are used, tried, and tested, and I prefer those (and the room I moderate does reasonably well) over stronger messages. However, RO teams and site moderators can only do so much to tame a crowd. Someday a line has to be drawn, and this one is drawn crystal clear. It is unfortunate, but that message needs strong words, even if it makes the users involved a bit uncomfortable. I agree that this shouldn't be needed, but I personally don't see another option given past incidents in various rooms.
    – rene
    Commented May 4, 2018 at 12:35
  • 1
    let me add that I strongly believe these measures are not being put in place to scare you or other users away; quite the opposite. They are meant to guarantee that SE sites can be a safe place to visit for all users, not just a small subset. Striking a good balance is maybe the struggle here.
    – rene
    Commented May 4, 2018 at 12:38
7

The problem I have with this, and I suspect part of the reason the rules have to be really vague, is that there are (at least) two completely different kinds of chatrooms.

  1. "Third place" chatrooms. Long-lived rooms where people can go to hang out and banter. These places work best when they are welcoming, and it makes perfect sense to be a stickler about the "Be Nice" (and related) policies in here.
  2. "Take it outside" chatrooms. These happen because two or more parties got engaged in a discussion/disagreement in question/answer comments, and some sensible person hit the "move it to chat" button. A large part of what triggers this is discussions getting acrimonious. Now obviously there are boundaries (eg: flat out slurs or name-calling), but otherwise the Be Nice rules need to be a little looser here to give the parties involved a chance to work out their differences and cool off in a less public place. As a mod, if I didn't have the ability to take such discussions "outside", my only recourse would be mass-deleting comments and locking posts so the deleted points don't re-emerge.

In short, I don't think the rules of behavior can, or should be identical for proper posts, Third Place chatrooms, and Post Argument chatrooms.

I realize reading over the question that likely what isn't being said is that some "Third Place" chatrooms were regularly getting content far outside the standards I outlined as appropriate for them under point 1 above. But part of my point here is that the need for things to be looser in Post Argument chatrooms is confusing things here. Perhaps if the software made some kind of distinction, it would be easier for us humans to do that somewhat objectively as well? There are some things that really ought to be functionally different for Post Argument chats too (eg: they really shouldn't get deleted as quickly, because then people who come late and want to join the discussion end up doing it back in the post comments. Arg!!)

4

I just need to drop my two cents here. I already wrote a comment:

Ah, yep, I can confirm that you close rooms in that case. Unfortunately, you suggested that it would be possible to reopen chats, yet you didn't give a final statement for more than 6 months that you would leave the room closed. That is how you lost my trust in community management. It is valid and correct to close rooms for good reasons, but don't imply that there are chances to change that if you've already decided to leave a room closed.

I really want to give some more background for my comment. I was one of the room owners of “Android Era with Kotlin and Java”. Some stuff happened in that room which is not excusable. The majority of the room owners were informed that the room would be closed only minutes before it was closed. It was implied that if we added some good reasons, the room might be opened again. We never got a final answer; now it does not matter at all. In the end, I am sure that the room was closed because of missing moderation tools.

I'm sure the room could have been saved for good if there were tools to enforce "room rules": no gifs, a per-room configurable ban for users who were simply not welcome in a specific room, a more intuitive explanation of the timeout feature. An official bot API would also be nice, for many reasons (e.g. for fun or documentation lookup).

In the end chats can be a good thing with the right people or can go to the dogs with the wrong users.

16
  • 1
    "For fun" is where problems would start to pop up first. Commented Apr 30, 2018 at 19:37
  • 3
    Some rooms do have very strict per-room rules and they are codified on a webpage (or GitHub Gist) and the ROs are expected to enforce those rules. But that doesn't necessarily require any additional support from the network. For example the "chatiquette" for the PPCG main room, The Nineteenth Byte. Also, be aware that there's a great chat room for the SOBotics group where many users work together to create bots for chat. If you haven't talked to them, consider it!
    – Catija
    Commented Apr 30, 2018 at 19:43
  • 1
    @AndrasDeak with great power comes great responsibility, like in moderation in general. Take e.g. the smoke bot: it is very helpful. In my opinion it was somehow fun to use the bot.
    – rekire
    Commented Apr 30, 2018 at 19:43
  • 1
    @Catija possible. I don't know the rooms, I'm just talking here about my historic view.
    – rekire
    Commented Apr 30, 2018 at 19:46
  • 1
  • Just open a new (and positive) room? Just so you know - the policy of Stack Overflow is very relaxed towards bans. At the Node.js org, users who are abusive in any way are usually banned for life (or until they reach out respectfully and discuss how to regain our trust). If you want to make amends, then prove it - you can open a new room and build a great, constructive culture in it. Commented Apr 30, 2018 at 20:08
  • 8
    "So, if we see rooms where: Offensive stuff that violates our CoC isn't flagged, Offensive stuff that violates our CoC isn't just allowed (however tacitly, through nobody flagging it), it's encouraged, People are berated, kicked or otherwise harassed for holding a room's culture to our code of conduct: We're going to shut the room down permanently. And this isn't the first time we've done this." - Android Era with Kotlin and Java ticked the box on two if not all three of those, so shutting down the room permanently fits within the described policy. Multiple warnings were issued before this. Commented Apr 30, 2018 at 20:33
  • 1
    @BradLarson yep, the room had downsides, so on one hand it is absolutely correct that it was closed. On the other hand, there was initially a very cool atmosphere which is gone forever and cannot be recovered; it is very likely that even a theoretical reopening would never bring that back. Just to clarify, I don't fight for the room. However, with that history in mind, I feel bad writing anything not absolutely relevant to a room, since I really want to avoid being somehow part of the reason a room gets closed.
    – rekire
    Commented Apr 30, 2018 at 21:23
  • 1
    @BenjaminGruenbaum I didn't feel banned at all. So yeah, I was thinking about opening a new room, but as a clear statement that I accept the consequences, I never opened a new room again. I also guess that I don't have the time to moderate a room today. The spirit which I mentioned earlier is gone, and so is my motivation. The good old days... I feel old.
    – rekire
    Commented Apr 30, 2018 at 21:27
  • 6
    If you have to bring up that room: the problem there wasn’t lacking moderation tools. It was lacking moderation.
    – Cerbrus
    Commented Apr 30, 2018 at 21:33
  • 1
    @Cerbrus hey, it's you again. Great that you're still active. I really thought I had made my view clear; it's not worth it for me to waste more time on that. We should just agree that we will never agree about that room. It's somehow funny; I guess you're the only person about whom I feel that way.
    – rekire
    Commented Apr 30, 2018 at 21:38
    ”In the end I am sure that the room was closed by missing moderation tools” — there, you are implying the problem was the available tools. I commented in order to correct that.
    – Cerbrus
    Commented Apr 30, 2018 at 21:42
  • 1
    Dear @Cerbrus, yes, I think so. If I had been able to suppress gifs, I would have been more active in the chat. Because of the perceived risk of drawing bad attention (in the office) with flickering images on my second screen, I stopped keeping the chat always open. Failing that, the ability to kick 1-3 people out of the room would also have given me the power to keep those toxic people out of the chat. Problem solved. I know you don't see a causal dependency here, but I do. It can't be that hard to understand why I think so. FYI, I'll avoid looking here again, as I don't want to get upset by you.
    – rekire
    Commented Apr 30, 2018 at 21:56
  • 4
    Never got a final answer? You didn't think that -20 on your question was a pretty resounding no from the community for reopening? Commented Apr 30, 2018 at 23:43
  • 2
    @rekire: You could've told the users to stop posting gifs. You could've kicked them if they continued. The tools were there.
    – Cerbrus
    Commented May 1, 2018 at 6:59
  • 1
    I agree with @Cerbrus. The tools were present, but there was never any enforcement. Commented May 2, 2018 at 8:42
4

My brain requires rules to be black-and-white for them to have any hope of being retained. However, the rules of life (especially surrounding interpersonal contact) are often subjective, so in these cases I try to find a comparative simile that I can remember instead.

I've been using chat regularly for only a couple of weeks, and I found myself forgetting that I was still on Stack Overflow, which I now attribute to the different "vibe" of the live 1:1 venue (as opposed to a forum, where "paying attention to what everyone says" is the whole point).

At one point I realized I had posted a piece of personal data that should not have been shared. The member I shared the information with wasn't the issue -- the problem was that after 2 minutes it became a permanent record in that chatroom (and potentially permanently Google-able), and thus an indelible mark on both me and on the site we all work so hard to keep "a step above" other forum sites.

I contacted a mod to remove the item, and while waiting I looked through my own history and was shocked at myself — politics were only part of the off-topic and/or inappropriate things I had brought up, intermingled with programming talk.

Anyhow, in the end, a couple of patient mods heeded my request to delete the room entirely, but the process made me realize that the "behaviour rule" actually is very black-and-white (for me, anyhow):

We must carry ourselves the way we would when in a Workplace.

(i.e. behave the same way we would in an office environment, or perhaps, a school.)

To me, this means:

  • Small talk is not only allowed, it's encouraged. It builds camaraderie and makes the day more enjoyable. Humans are social creatures.
  • By all means discuss "what you did over the weekend" or the upcoming episode of a favourite show.
  • Joking, laughing, maybe an occasional silly post isn't a big deal — as long as we remember that these rooms are open and that anyone may be within "earshot", now or in the future.
  • The important thing in a workplace (or our chatrooms) is to remember the primary purpose of the room. We're all here to do a job of sorts (post-secondary school would be an alternate comparison).

Just like a workplace or school, if we want to venture outside of what's appropriate, we're perfectly able to -- elsewhere. Make plans to "meet" somewhere after "work" for the electronic equivalent of a pint of beer. That's a more appropriate place to "unwind and let loose", get raunchy, misbehave, and then recover in time to return to "work" the following day, reassured that "work" and "play" have been kept safely distanced, thereby protecting yourself and your "co-workers" from embarrassment or worse.

I'm generally not great with similes, and I've pushed this one pretty far, so I'm not sure if I'm properly communicating my point, but am I on the right track?


Related links for tips on how to carry oneself, etc:


After posting a chat message, we unfortunately only have 2 minutes to "take back" something we say if we end up regretting it. I believe this is not enough time for us to: walk away from the computer → realize what we said/did/posted → run back to the computer → find the message → edit/delete it. I have a meta question proposing an increase (or alternate rules like "room owner can always delete") for this reason.

1

There are rooms that regularly and consistently violate this Meta post. For ages, nothing has been done to either those rooms or the posters there who get flagged.

Is this new policy going to be a universal rule?

Or only when the feelings of people who think like SE management politically get offended?

10
  • 4
    The truth hurts, I guess
    – DVK
    Commented May 1, 2018 at 19:05
  • 7
    Or it's a separate discussion that should be raised elsewhere instead of distracting from this one. Or of course, it's not truth at all.
    – Nij
    Commented May 1, 2018 at 19:49
  • 3
    @Nij - did you read the original blog post? if someone feels offended, that should be enough. But as I said in this answer, that only works if the powers that be agree with the person being offended
    – DVK
    Commented May 1, 2018 at 20:17
  • 8
    This isn't a new policy. Unfortunately, some things came to pass that required us to reiterate our stance very plainly, but this has been our stance for years. But thanks for sharing.
    – user50049
    Commented May 2, 2018 at 12:30
  • 3
    @TimPost - the policy isn't new. Selectively applying it isn't new either. The question is, is the whole "They’re just what the feeler is telling you. When someone tells you how they feel, you can pack up your magnifying glass and clue kit, cuz that’s the answer" new line of thinking going to apply to everyone's feelings or (as was the case previously), only to people whose feelings are approved.
    – DVK
    Commented May 2, 2018 at 14:18
  • As a "victim" of this policy, I can tell for almost sure that yes, it's going to be universal. Commented May 3, 2018 at 19:50
  • 1
    @ShadowWizard - thankfully, it will be easy to test. I know which room will serve as a litmus test of whether the policy is universally applied.
    – DVK
    Commented May 3, 2018 at 22:01
  • @DVK Tavern? :) Commented May 3, 2018 at 22:32
  • @ShadowWizard - I don't recall ever going there, to be honest
    – DVK
    Commented May 4, 2018 at 1:05
  • 2
    @DVK - yes it does seem to be very painful for a lot of people. I suppose my answer will soon have as many downvotes as yours!
    – Vector
    Commented May 4, 2018 at 15:58
-4

Forgive me if this has already been said: I can't justify reading 23 thousand words to find out.

But, doesn't this just give the trolls another weapon to cause trouble?

"Hey, Joe! Let's see if we can write a couple of bots to get an SE chat room deleted!"

6
  • 2
    Easy enough to track down the bots, find who created them, suspend/nuke their accounts, and reopen the chat room like nothing happened. No different from voting rings and bad sock puppets on the main sites. People can always game the system, the punishment resulting from this keeps them at bay. Commented May 14, 2018 at 11:16
  • 1
    No. There seems to be this misconception that a one-off event would trigger a room deletion... but this is extremely unlikely. The important thing to take away from this post is that it's about a failure of the users in the room to moderate content: not flagging, not telling people to stop, and attacking people who do try to get it to stop. Trolling doesn't really fall in here. If one user is a problem, the user will be dealt with. If the room culture fails to apply "Be Nice", then the room will be deleted.
    – Catija
    Commented May 14, 2018 at 12:15
  • 1
    You don't really need a bot to cast said flags anyway, anyone can do a search and cast a few out of context flags to restart all the drama.
    – Kevin B
    Commented May 14, 2018 at 17:29
  • I wasn't suggesting a bot to cast flags. Flagging requires intelligence, which bots do not have. I was thinking that trolls might write bots to make things messy, trying to get the room deleted.
    – WGroleau
    Commented May 14, 2018 at 19:59
  • And is it a "one-off event" if one or more (censored) runs bots to clutter up a chat room with multiple garbage messages?
    – WGroleau
    Commented May 14, 2018 at 22:04
  • If it's obvious that the room is being targeted, yes... This happens even now... hasn't caused a room to be deleted yet... more often, they get put in gallery mode (permanently or temporarily) and users have to request access to chat at all. Keeps the trolls out and the users don't have to deal with it.
    – Catija
    Commented May 14, 2018 at 22:13
-5

While we're talking about chat flags, we also need a way to prevent misuse of the flags for simply advertising a message to every 10k user on the network.

12
  • 2
    I can't say that I see this happening with sufficient frequency to concern me...
    – Catija
    Commented May 3, 2018 at 18:50
  • Alright, but at least have that thought on the list of potential problems to address, so if it becomes a problem we won't need to wait for the next dramatic review of the rules like this to be able to implement it. Since we're unearthing all the long-standing issues with chat, I thought I'd bring up everything there is that matters. Commented May 3, 2018 at 22:09
  • The easy solution there, which has been tossed around for a while, is just making all flags mod-only. No 10K flags at all. It's not been widely accepted as a solution, but it's an option. :)
    – Catija
    Commented May 3, 2018 at 22:11
  • I think flags need to go through 3 tiers of visibility: room owners, local site moderators (for the room's parent site) and then all mods if the flag isn't dealt with by each of the groups in order. Give them some time like 2-5 minutes and then level it up if there is no action. If regular users flag, it goes to room owners first. If room owners flag, it goes to local mods. This also solves the problem of needlessly bothering all global mods and 10k users. And the flag history should be left for moderators to review in case room owners abuse de-flagging. How does this plan sound? Commented May 3, 2018 at 22:18
  • eh, for that to work, there would need to be different rules in place for rooms that haven't been open long enough, or that don't get a certain amount of traffic. Otherwise, someone being abusive in a room for 2 people won't get seen by anyone but the owner of that room assuming the abusive user is the owner of that room.
    – Kevin B
    Commented May 4, 2018 at 15:33
  • @KevinB it's already in these rules I just described. If it's not dealt with any tier in 5 minutes, it goes a tier higher. Automatically moves up if there are no applicable members of a tier in the room (no room owners → straight to local mods. No local mods → straight to global mods) Commented May 5, 2018 at 1:54
  • right, so, scenario time. You've been invited to a room. There are two people in the room. One of them is a room owner. If the non-room owner is abusive, and you flag it, that room owner, tier 1, will be the one to handle it. regardless of whether or not they were involved. The same goes for a more populated room. Say, someone flags another room owner in the javascript chat. The other room owners can decide that since it's a room owner, they'll let it slide and just decline it. No one else saw the flag because once again, Tier 1 dealt with it. the tier system works great when there's no abuse.
    – Kevin B
    Commented May 5, 2018 at 1:58
  • ... but so does the current system.
    – Kevin B
    Commented May 5, 2018 at 1:59
  • So that's why there's going to be a history of flags raised and resolutions chosen for mods to review. If the room owners are found abusing the flag system, their flag reviewing rights may be revoked (across all rooms), basically marking them as untrustworthy reviewer, placing them in tier 0, same as regular users. The flags will then go past them to local moderators if there are no fit room owners with review privileges left. Commented May 5, 2018 at 2:06
  • @Catija: at least one mod has stated he/she goes to the chat room in response to a flag. If he/she is the only one who can flag an entry, then neither will ever happen.
    – WGroleau
    Commented May 14, 2018 at 22:08
  • 1
    @WGroleau I don't follow. Everyone can flag... it's a matter of who sees the flags.
    – Catija
    Commented May 14, 2018 at 22:11
  • Yes, but you seemed to suggest making flags mod-only, which would lead to not having flags if a moderator only visits to respond to a flag.
    – WGroleau
    Commented May 15, 2018 at 8:34
-8

I predict some day that Stack Exchange will be up for a notable prize in some social studies area. And they will look at this decision as the reason that SE does not deserve one.

The disturbance in chat is trying to tell you something very important.

I do not know what that something is. But I hear it screaming out loud and clear. And instead of listening to the early warning signals, the choice is being made to turn off the alarm and ignore the problem.

The problem is not going to go away. It is a virus in the system that is man. Fix the problem. Do not silence the alarm.

4
    We have been talking about the general tone in The Workplace chat for several months now.
    – Chad
    Commented May 3, 2018 at 18:46
    Brilliant remarks. Do not silence the alarm - but that is so much easier, and it gives those in charge the feeling that they've solved something. It also allows them to believe that they are gods, with the power to fix the virus in the system that is man.
    – Vector
    Commented May 4, 2018 at 22:14
  • Trolls don't troll for an important reason. They just troll without reason. Commented May 6, 2018 at 15:46
  • 1
    @JohnMiliter no one does something for no reason. There is always a catalyst to action. It's the fabric of our reality.
    – Chad
    Commented May 9, 2018 at 23:01
-13

We're More Aggressively Enforcing Self-Moderation In Chat

Self-moderation? Surely if there's an issue, moderators aren't doing their job. Surely you then need to improve the moderation rather than anything else.

While the timing of this post coincides with us expressing some serious concerns around how we're not doing a good job of helping and guiding Stack Overflow to remain a welcoming place for everyone, this is something that's been weighing heavily on our minds for quite some time, and applicable to any site that's wired into chat (AKA, all of them).

Good stuff.

Sometimes you have problems that stay dormant for months, heck, even years, but when they flare up — it's really ugly. I'm going to make a very firm statement that I'm super proud of 97% of our chat rooms that remain some of the safest places to hang out and 'talk shop' on the Internet; you folks are doing an amazing job of helping us prove that groups of responsible people tend to bring out the very best in one another given loose rules that are often open to interpretation (see linked and related posts, too).

Sounds great.

Unfortunately, I need to take a moment and talk about the remaining 3%¹. Those of you that regularly use chat have probably noticed that each room has a rather distinct culture. In some rooms, a little off-topic 'fun' is not only permitted, but also encouraged, and generally serves to make the culture of the room brighter, and the experience of spending time there more rewarding. In these rooms, troll-like behavior or other things that don't reconcile well with our code of conduct are quickly flagged and removed.

Yep, sounds great.

Other rooms prefer to keep the conversation more on-topic, with a focus that's more like a laser than a campfire. Our guidance has always been to essentially go with the flow, as long as that flow isn't something that doesn't appear to belong on our chat system, or doesn't easily come to terms with our code of conduct.

What's not to like?

And that gets us to the hard part. It's terribly difficult and ineffective to write a list of things you can or can't say in chat. First, you just invite a lot of rule-lawyering (the Internet version of but I'm not touching you!! I'm not technically touching you!!!) and second, new people see this oddly specific list of things like "Please don't talk about what monkeys really mean by farting" and wonder what kind of crazy people might be lurking behind the door. I could give more real, concrete examples - but let's not go there.

You don't need to. What underpins morality is do unto others as you would have others do unto you. I know the difference between right and wrong because I know what I wouldn't like done to me.

What positively has to function in order for these rooms to exist with our branding behind them is: stuff that doesn't belong, or that doesn't reconcile with our code of conduct, is flagged. The culture of our rooms must be welcoming above anything else to anyone that puts forward a good-faith effort to join and interact. So, if we see rooms where:

  • Offensive stuff that violates our CoC isn't flagged.
  • Offensive stuff that violates our CoC isn't just allowed (however tacitly, through nobody flagging it), it's encouraged.
  • People are berated, kicked or otherwise harassed for holding a room's culture to our code of conduct.

We're going to shut the room down permanently.

Shut the room down permanently? Why? Why not just get some new moderators who will do the job properly? It's wrong to punish everybody because some users have been nasty, and moderators have let them get away with it. Presumably because the latter are the former.

And this isn't the first time we've done this. None of this is new, and as I said earlier, problems sometimes pretend to go away while they secretly find ways to bite you even harder - I put the blame for needing to come here and reiterate all of this yet again squarely on us.

But that doesn't absolve folks from the responsibilities that go along with the privilege of using chat. Chat is a great tool, and we are really proud of the caliber of discourse that flows through our systems every day. We want to keep it available because we're really proud of what most folks do with it. But we can't have self-policing break in the face of flagrant violations of our Code Of Conduct, and we'll be enforcing that with calm, steely-faced smiles going forward.

Questions? Observations? Anything else? Please leave an answer or a comment. We love leading when it mostly means gently guiding people to do what's good for all of us, and we really don't like it when we need to do it more deliberately. But, we're the custodians of the reputation all of you helped us to build, so we must.

If you've got problems with chat moderation, fix it. Don't close down chat.

22
  • Would the downvoters care to explain why they think it's better to close down a chat room forever instead of addressing the moderation of that room? Commented Apr 30, 2018 at 20:03
  • 7
    I didn't vote either way but I found this answer pretty hard to understand. Commented Apr 30, 2018 at 20:05
  • 3
    The people in that room do a large part of the moderation by flagging messages which do run afoul of the site's rules. If that's not happening, then the people in the room don't care about the site's moderation, so closing it down dramatically slows down the tide of tone-deaf Meta rants about their chat room suddenly being moderated.
    – Makoto
    Commented Apr 30, 2018 at 20:05
  • 8
    There is no indication in the question that a variety of steps wouldn't be taken first - such as removing problematic users from chat entirely. If one user is the problem, they're not going to punish the entire room. I don't see that changing. But, if the room is putting up with that one problematic user by not kicking or flagging them, then the "culture" of that room is problematic and needs to be addressed.
    – Catija
    Commented Apr 30, 2018 at 20:07
  • @Makoto : chat flagging is broken. I've been suspended without warning for flagging offensive comments in chat. Commented Apr 30, 2018 at 20:14
  • @Catija : that's not how I read the OP or the associated history. Do note that Tim Post didn't say the room would be suspended. He said we're going to shut the room down permanently. Commented Apr 30, 2018 at 20:16
  • 6
    You've missed reading the bullet points for when that would be the solution. None of those bullet points are "If single or multiple users are a problem but the ROs and other users of the room are flagging or otherwise responding to them to put an end to the behavior"... they specifically outline a room that is systematically failing to enforce the Be Nice policy by not flagging, by not telling people to cut it out, and by harassing visitors who attempt to correct the behavior in the room.
    – Catija
    Commented Apr 30, 2018 at 20:26
  • @Catija : I read the bullet points. They said if we see rooms where X, Y, and Z are happening, we're going to shut the room down permanently. Not fix the moderation. Commented Apr 30, 2018 at 20:55
  • 1
    If X, Y, and Z are all happening, that particular iteration of the room is already beyond rescue. Again, if users are acting in good faith with the Be Nice Policy by flagging problematic behavior, this will not be a problem... which is what I said in both of my prior comments.
    – Catija
    Commented Apr 30, 2018 at 20:58
  • @Catija : no, if X, Y, and Z are all happening, the room needs a new owner, and the moderators either need talking to or replacing. Closing down the whole room instead of admitting there's a moderation issue is wrong. As for chat flags, see "flags in chat are defective by design". Search on chat flags; they don't work. Commented Apr 30, 2018 at 21:09
  • @Catija I'd put forth that I think a lot of (self) moderation should be coming well before the flying of flags: in all the unhappy/unhealthy rooms I've stumbled into in my time here the problem seemed to me that people's words either weren't being used or they weren't having an appropriate effect. Flags are a last resort, not a primary tool, IMO.
    – nitsua60
    Commented Apr 30, 2018 at 23:46
  • 1
    @Catija Well, I don't mean mods as in diamonds. I mean mods as in anyone that should (community) moderate. And by 'mods are asleep scenario', I mean this: Consider a chatroom with heavy fluctuation of activity, such as the H-bar of Physics. Suppose all regulars are out, a new user comes in and a troll is hostile against the new user. Suppose the user doesn't know to flag (or is too taken aback from the hostility). As far as I understand, this post states that is reason to close the room. Commented May 1, 2018 at 12:08
  • 1
    So, inaction from the regulars of a room is apparently a capital offense. Is it the job description of the regulars to moderate a room? Are they responsible for all that happens in the room, even if they aren't involved? Commented May 1, 2018 at 12:08
  • 5
    That's literally the job of a RO, @Discretelizard Good ROs of rooms like that one should be expected to scan the transcript to see if anything untoward happened in their absence that needs attention. Hopefully there's sufficient of them that this is light work and, if not, ask for more. I understand your concern... but if there's a problematic troll hanging out in a room you use regularly enough for this to be a concern... that's a time to ask for help putting the troll on ice. But I have seen ROs lose their status for refusing/failing repeatedly to act on r/a content.
    – Catija
    Commented May 1, 2018 at 12:31
  • 3
    you’re taking this all way too seriously. shog isn’t going to drop in and delete a long-standing room for one offensive action going unmoderated. that certainly could happen on one-off recently created rooms, but i think that’s an entirely different situation.
    – Kevin B
    Commented May 1, 2018 at 13:52
-18

Bravo for the initiative. Some proposals, motivated by the topic sentence of the blog post:

Too many people experience Stack Overflow as a hostile or elitist place, especially newer coders, women, people of color, and others in marginalized groups. [boldface added]

  1. Put either this whole post as written, with a link to the blog post and a link to the survey, on each site's Meta, perhaps as a closed question sending people here. The goal: to get word out to as many people as possible.

  2. Create a gender field for the profile page, with at least four possible answers: male, female, prefer not to say or unstated, and other (optional fill in). Include prominent instructions to all participants about not assuming people's gender. Once an assumption is made, it is very awkward to try to correct it to unstated.

    Or include a field called "preferred pronouns" with possible answers "he/him", "she/her", "they", "other" (fill in).

  3. Explicitly instruct moderators to step in and assist when a participant makes a flag about a gender assumption. You might think this would be an obvious situation where moderator action would be needed, but this might not be obvious to all moderators.

    Writing directly to the community team is not necessarily a solution in this situation, because not all participants know how to do that, and because the community team often has a significant backlog. By the time the community team gets around to looking at it, other participants' thinking about the participant's gender will have already been formed.

    Why is the gender assumption problem important? Because the internet is a place where women find it easiest to drive while genderless. But if someone outs her, even through a careless error, her gender-ambiguous safety zone, which she may have spent months or years creating within the SE world, can fall apart quite quickly. It's hard to get someone to think of you as undefined if they've already gotten used to thinking of you as she or he.

  4. Hurtful comments can come from the most well-meaning participants. It can happen to anyone. I've seen it happen to good people who are normally considerate; and it has happened to me. But just as we can all let something slip out inadvertently, without realizing how it might come across as hurtful to someone else, we can also all (well, 99.9% of us) learn how to minimize the chances of it recurring. The key is to "take time to teach" (to quote Faber and Mazlish, authors of How to Talk so Kids Will Listen...And Listen So Kids Will Talk). Too often, the participant who committed a foul, and gets their comment deleted, never finds out about the deletion, or doesn't understand why the comment was deleted.

    We need a system whereby the person who wrote a non-nice comment can be led to understand how such a comment could be hurtful to someone else, and what alternative phrasing could be used in that particular situation.

  5. Change the flagging protocol in chat rooms. Currently, outside Chat, one flag is enough to get a moderator's attention, but inside Chat, as I understand it, three flags are needed.

    Keep in mind that it is in the less structured environments where the most hurtful peer-to-peer interactions tend to happen. The SE Chat Room is analogous to the school bus or the locker room. It is in the chat room where negative incidents are the most likely to pop up. Let's be ready for them. When they occur, let's help the person understand what was hurtful and how to avoid making someone else feel unwelcome.

    There should be a way for a participant to submit an immediate Mayday message within a Chat room. I have seen situations develop in Chat where comments are flying thick and fast, with sloppily written messages and lots of misunderstandings occurring, and one participant suddenly feels completely overwhelmed by hurtful comments, but it's hard to pinpoint exactly which comment or user to flag. A Mayday button would give the participant a way of flagging the conversation at that point to say, "Something's gone horribly wrong and I'm getting out of here now. But please take a look at the transcript."

    (Perhaps start thinking about whether it really is workable in the long run to allow chat rooms to permit comments which, on the main question page, would be flaggable as too chatty, obsolete, etc. There are some chat rooms which get incredibly off the focus of their site, and this is very different from the way the rest of SE works.)

  6. In a chat room where problems have occurred repeatedly, institute a policy in which the general free-for-all chat room is only open for business during specific periods when a moderator is present. This would be infinitely more effective than just shutting it down. Again, this comes down to taking time to teach.

  7. Let's make the suspension system more effective in changing behavior and preventing future problems. It's fine to give the participant a cooling-off period -- but at some point, it's important to take time to teach and help the suspended participant understand what went wrong.

    Generally, the site moderators don't have the objectivity needed to do that teaching, because by the time the decision to suspend has been made, the moderators have reached a high level of frustration.

    One possible solution would be to create a volunteer role of mediator, diplomat or ombudsman. This person would help the newcomer who feels unwelcome. This person could, for example, show the timid participant examples of what assertiveness without aggression might look like in the SE world, and do some role-play practice. This person could also help someone who got flagged and suspended learn new ways of responding.

10
  • 20
    I really don't, at all, feel like your first three points have anything to do with the topic at hand. What does people dealing with their gender identity on the internet have to do with self-moderating chat? Especially on a network where gender tends to be one of the least important details about a user...
    – Kendra
    Commented May 2, 2018 at 20:46
  • 4
    Point 4 would be next to impossible to truly implement in any meaningful way, as far as I can see. (If someone proves me wrong, I would be glad to be wrong!) Point 5 would be abused "Oh, they said something off-topic MAYDAY!", point 6 seems to neglect the point that chat is basically the water cooler of SE- You can talk about non-work related topics at the water cooler, though you still have a code of conduct to follow. I plain don't follow point 7 and how it would be implemented.
    – Kendra
    Commented May 2, 2018 at 20:50
  • 1
    @Kendra - Re: relevance of my first three points to the topic at hand: The blog post that got all of this started says, "Too many people experience Stack Overflow¹ as a hostile or elitist place, especially newer coders, women, people of color, and others in marginalized groups." // I get the feeling you haven't come across gender issues at SE -- I'm glad for you. I'm not going to write here about specific gender issues I have seen. My focus in this post is on what to do about them, ... Commented May 3, 2018 at 3:46
  • 1
    ... which is what I understood we were being asked to think about and work on. // Re Point 4: teaching takes time. I would be happy to participate in a proof-of-concept initiative. // Re Point 5: If one wants to do away with a zero-tolerance-style model, which is basically how SE works at the end of the day, then one needs to take a more graduated approach to discipline, whereby one shortens the leash when a participant's behavior shows that more support and less free rein is needed. Behavior that would trigger a shortening of the leash could be aggressiveness, but it could... Commented May 3, 2018 at 3:51
  • 1
    ... also be a misuse of the Mayday button. Basically, with privileges (such as the privilege to wander around chat rooms freely, or to be able to hit the Mayday button) comes responsibility (including the obligation not to cry wolf). // Re the last point: if you've never experienced a suspension, then to understand this proposal you'd have to use some intense empathy to understand how the suspended person feels when the moderators are too frustrated to listen, respond, and explain with the degree of patience that the suspended person likely needs, to move past their feelings of injustice. Commented May 3, 2018 at 3:58
  • 11
    I don't understand why people need to state their gender identity at all in a more supported manner. We have the profile for that. Also nobody is obligated or can be obligated to use chosen pronouns, as people can @username someone instead in any case. This is a non-issue, since no SE format requires you to even use pronouns at all. Not that I'm particularly averse to chosen pronouns, but SE is a large site with users from all areas of the world and only a very small part of the world has even recognized the use of chosen pronouns at all, so that is an unnecessary source of conflict.
    – Magisch
    Commented May 3, 2018 at 6:43
  • @user5389107 - English is the common language used in almost all SE sites; the pronouns that create problems occasionally for some participants are not the ones the participant uses to refer to him or herself. Those are easy: I, me. Problems can arise in a male-dominated site, if participant X wants to remain gender-undefined, but someone else starts referring to X as he or she. In a male-dominated site, Statement A can be interpreted completely differently by many people, if it is perceived as having been written by a woman, than it is perceived as having been written by a man. Commented May 3, 2018 at 15:41
  • This is especially a problem when gender issues are asked about in a question. Commented May 3, 2018 at 15:43
  • 3
    Random note: your point 5 is not correct - it doesn't take three flags for a mod or someone with 10k network-wide reputation to be notified, it only takes one (using the flag button on the right side of a message). If you use "flag for moderator" by clicking the right-side dropdown of a message, then all mods are notified right away (but 10k users are not).
    – user168476
    Commented May 4, 2018 at 3:38
  • @Ash - Glad to hear it! Thanks for clarifying. Commented May 4, 2018 at 3:40
