Recommendations for handling difficult situations

I’m in the process of coming up with some basic recommendations for Discourse owners and moderators on how to make challenging moderation decisions in a sustainable, community-friendly way.

Here’s what I have so far:

  1. Share moderation duties with multiple people. Having a single focal point for all mod decisions is fatiguing and can lead to undue flashpoints and personal attacks. If there’s only one person in charge, is the forum “you”? What if you’re tired, or cranky, or in a bad place? What if you find yourself overly involved in a topic and want to defer moderation to someone else? Relying on a single moderator is a dangerous path to be on.

  2. When you must ban, be decisive. Certainly give people another chance if they make a mistake, but when that chance is exhausted, ban them cleanly and decisively. It is not your job to reform this person. You are not obligated to let them try over and over. You can keep the door open by advising them to email you in 6 months or a year if they want to try again. But the onus to recognize the problem, apologize, and ask politely to try again – that is on them.

  3. Discuss bans, but not too much, and only in the right place. Please don’t celebrate bans or constantly bring them up, and definitely don’t clutter unrelated topics with ban conversation. Keep ban discussion in the “forum feedback” category.

  4. Keep heated moderation talk private. If things are getting heated, move that part of the discussion out of the public topic and into a private group message. Take it offline entirely if needed, to email or voice. At minimum, move it to the “forum feedback” category so other topics aren’t cluttered with off-topic moderation discussion.

  5. Take a break. Sometimes people just need to walk away for a while and take a break from the discussion. A short timed closure of a contentious topic, or a day suspension, might be in order. Avoid making hasty decisions in the heat of the moment, and discourage others from doing so as well.
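
If you ever script these cool-down actions instead of clicking through the admin UI, here is a minimal sketch of how a timed topic closure and a one-day suspension might look against the Discourse REST API. The forum URL, API key, and exact parameter handling are assumptions based on the standard admin endpoints, so verify them against your own instance first.

```python
# Minimal sketch, assuming the standard Discourse admin API endpoints.
# The forum URL and API key are placeholders.
from datetime import date, datetime, timedelta, timezone

import requests

BASE = "https://forum.example.com"    # hypothetical forum URL
HEADERS = {
    "Api-Key": "YOUR_ADMIN_API_KEY",  # placeholder credentials
    "Api-Username": "system",
}


def close_topic_for(topic_id: int, hours: int) -> None:
    """Set a topic timer that automatically closes a contentious topic after `hours`."""
    close_at = (datetime.now(timezone.utc) + timedelta(hours=hours)).isoformat()
    r = requests.post(
        f"{BASE}/t/{topic_id}/timer.json",
        headers=HEADERS,
        json={"time": close_at, "status_type": "close"},
    )
    r.raise_for_status()


def suspend_for_a_day(user_id: int, reason: str) -> None:
    """Suspend a user for one day, with an explicit reason."""
    until = (date.today() + timedelta(days=1)).isoformat()
    r = requests.put(
        f"{BASE}/admin/users/{user_id}/suspend.json",
        headers=HEADERS,
        json={"suspend_until": until, "reason": reason},
    )
    r.raise_for_status()
```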

37 Likes

On the forum that I manage, we’ve found that being clear about why we do the things we do makes the community more cooperative. If a moderator is perceived to be arbitrary, it causes problems. Of course, that means we’re constantly iterating on our community guidelines – but not so often that we get bogged down in rules.

When we have to impose sanctions, our default mantra is to set blasters to stun. Serious action like thread closure (our people don’t like to be interrupted) or suspension requires more than one pair of eyes before it goes through.

11 Likes

That. 110%. Over at Stonehearth, any “major” action is discussed by at least two staff, and typically ends up being discussed by all active staff, before taking place. We’ve taken advantage of the updated blocking feature to prevent public posting while we chat quietly with a user. If necessary – and thus far it rarely has been – we’ll go for a suspension. If the offense is clear and indisputable, we’ll block without discussion, but that too is quite rare; the only instances where a single moderator would block without discussion are an inappropriate username and/or profile image. Even when we had a clear spammer posting pornographic videos (which were quickly flagged into oblivion), we didn’t delete the spammer until after a (brief) discussion.

Another focus for us is leading by example. Our staff is active in the community, and many users don’t immediately recognize us as staff – just as friendly users. We try to welcome each new user after their first post with a “hello and welcome” reply and be as friendly as we can. If a user posts in the wrong place, we’ll move it and then post a reply explaining why. 95% of the time users are appreciative, and this helps foster a friendly and helpful community. As a result, most users will seek us out when they need help, and we aren’t some unknown being watching over the forums.

9 Likes

As I reflect on this topic, I think the advice here is still sound. However, there is one thing I’d possibly add:

  1. Consider having a site feedback topic summarizing significant moderator decisions and the rationale behind those decisions. We find that transparency in moderation goes a long way toward making communities more sustainable. Over time, you’ll end up with something of an illustrated rulebook… specific examples of the types of behaviors that are not welcome on your site, and the consequences of those behaviors. Related discussions can be spun out into other site feedback topics to improve and enhance your community as well. That’s the goal!

21 Likes

I also wanted to capture this thought here, because there’s something uniquely dangerous about it, and it’s definitely on the extreme end of the moderator challenge level.


I found this exchange on Hacker News fascinating.

I think the relevant guideline puts it well:

Comments should get more civil and substantive, not less, as a topic gets more divisive.

The inadequacy of this guideline, and of couching most moderation along its lines, is why the problem and the ‘dynamics’, as tptacek puts it, exist in the first place.

The site selects for and breeds civil, substantive racists and misogynists (along with the hyper-sensitized responses) like a hospital breeds antibiotic-resistant superbugs.

I can see ‘selects for’, but ‘breeds’ seems a stretch. Unless you mean breeds civility within racists and misogynists, which seems beneficial?

Yes, mostly the second thing. It’s the opposite of beneficial: because the guidelines say ‘don’t be a meanie/obvious blowhard’, and most people who get called out for anything are called out for something along those lines, bigots who adapt to those rules can and sometimes do last on the site for years.

HN’s mods put a great deal of effort in and are surprisingly successful at containing the far more basic and common human impulse to be a jerk to strangers online. They have rules, they enforce them, they publicly shame rulebreakers, etc. You are explicitly not allowed to be an asshat on HN and everyone knows it. The place would be better if ‘don’t be a bigot’ got the same treatment. All-caps users and transgressors against HN’s fundamentalist quotation-marks cult are exposed to more public opprobrium than your typical “human biodiversity” sea lion.

I had never thought about it this way, but he’s right – a racist, misogynist, or bigoted line of argument is far more dangerous when it is draped in the robes of overt civility. So to the extent that we are teaching people …

Hey, it’s OK to say racist / sexist / bigoted things, as long as you say them in a civil, substantive manner

… we are indirectly creating much more powerful racists / sexists / bigots, who will become immune to less sophisticated moderators, who only see “well, what this person is saying is kind of morally abhorrent, but they aren’t saying it in a mean way…”

13 Likes

Studies where they give classes on empathy, “healthy relating”, and non-violent communication to people far out on the psychopathy spectrum show that they become more effective… psychopaths.

It’s indeed a challenging aspect of being facilitators and moderators in a community.

12 Likes

It is worth considering a group-led category for moderation. This way people can opt in to discussions while the vast majority of users can opt out.

We created a logged-in-users category so discussions can easily be moved behind a Trust Level 0 restriction, away from web crawlers and other anonymous traffic.
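
For anyone setting this up, here is a minimal sketch of creating such a category over the Discourse API, granting access only to the trust_level_0 group (i.e. any logged-in user). The URL, API key, colors, and the permission value mapping are assumptions to verify against your Discourse version.

```python
# Minimal sketch, assuming the standard Discourse category-creation endpoint.
# URL, key, and colors are placeholders; verify the permission values for your version.
import requests

BASE = "https://forum.example.com"    # hypothetical forum URL
HEADERS = {"Api-Key": "YOUR_ADMIN_API_KEY", "Api-Username": "system"}


def create_logged_in_only_category(name: str) -> dict:
    """Create a category visible only to logged-in users (trust_level_0 and up),
    keeping its topics away from web crawlers and other anonymous traffic."""
    resp = requests.post(
        f"{BASE}/categories.json",
        headers=HEADERS,
        data={
            "name": name,
            "color": "0088CC",
            "text_color": "FFFFFF",
            # assumed mapping: 1 = create / reply / see for the named group
            "permissions[trust_level_0]": 1,
        },
    )
    resp.raise_for_status()
    return resp.json()
```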

4 Likes

Here is some compelling data supporting the idea of banning your most difficult users. Fortunately, these kinds of users are quite rare, so it is worth the effort of banning them:

And, when moderating content, the importance of leaving a note about why bad posts were removed, rather than simply making them disappear:

The two papers are “Community Interaction and Conflict on the Web” (PDF link) and “Does Transparency in Moderation Really Matter? User Behavior After Content Removal Explanations on Reddit” (PDF link).

16 Likes

This topic is fantastic. I will definitely share it with my community manager. :grinning:

Thank you so much for creating it!

7 Likes

A detailed breakdown of good-faith vs. bad-faith posting from Boing Boing, which runs the longest-lived public Discourse instance in the world… if anyone should know, it’s them!

“Bad faith” in a moderation context means:

  1. Violating our community guidelines. This is the easy and obvious one.
  2. Posting something with the specific intent of enraging the community rather than participating in it. This necessarily requires some way of knowing that the user probably doesn’t believe what they are saying.
  3. A consistent pattern of taking the “devil’s advocate” position across multiple topics. We’ve had several bad actors do exactly this: they always took the contrarian position specifically to enrage the community.
  4. Repeatedly feigning personal involvement with topics: it’s improbable that one user is part of every minority, has done everything, and that every issue is personal to them. We’ve had bad actors do exactly this as well.
  5. Traditional “sealioning”: entering a discussion, demanding that it go only one way and that people expressly refute their position, while ignoring all others.

The moderators stress that all of the negative behaviors above should be handled by flagging :triangular_flag: and not with the reply button! Replies are encouragement; it’s like saying “yes please, we want more of this”.

17 Likes

This article is mostly about “world-eating”, “everyone on one website” :globe_showing_europe_africa: platforms, which Discourse wasn’t really designed for… but it does a great job of illustrating the general dangers of content moderation that starts (wrongly, of course) from “all speech is allowed!”

10 Likes

Wow! An excellent and illuminating article about social media and its many problems. Thank you. :+1:

6 Likes