Recommendations for handling difficult situations

I’m in the process of coming up with some basic recommendations for Discourse owners and moderators on how to make challenging moderation decisions in a sustainable, community-friendly way.

Here’s what I have so far:

  1. Share moderation duties with multiple people. Having a single focal point for all mod decisions is fatiguing and can lead to undue flashpoints and personal attacks. If there’s only one person in charge, is the forum “you”? What if you’re tired, or cranky, or in a bad place? What if you find yourself overly involved in a topic and want to defer moderation to someone else? That’s a dangerous path to be on.

  2. When you must ban, be decisive. Certainly give people another chance if they make a mistake, but then cleanly, decisively ban them. It is not your job to reform this person. You are not obligated to let them try over and over. You can keep the door open by advising them to email you in 6 months or a year if they want to try again. But the impetus to recognize the problem, apologize, and ask politely to try again – that is on them.

  3. Discuss bans, but not too much, and only in the right place. Please don’t celebrate bans or constantly bring them up, and definitely don’t clutter unrelated topics with ban conversation. Keep ban discussion in the “forum feedback” category.

  4. Keep heated moderation talk in private. If things are getting heated, move that part of the discussion out of the public topic and into a private group message. Maybe even take it offline if needed, to email or voice. At minimum, move it to the “forum feedback” category so other topics aren’t cluttered with off-topic moderation chatter.

  5. Take a break. Sometimes people just need to walk away for a while and take a break from the discussion. A short timed closure of a contentious topic, or a day suspension, might be in order. Avoid making hasty decisions in the heat of the moment, and discourage others from doing so as well.
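
In Discourse terms, the timed closure mentioned in point 5 is normally set from the topic’s wrench menu (the topic timer), but it can also be scripted. Below is a minimal sketch against the Discourse REST API’s topic status route; the forum URL, API key, and moderator_bot username are placeholders rather than anything from this topic.

```python
# Minimal sketch: close a contentious topic for a cooling-off period via the
# Discourse REST API (PUT /t/{topic_id}/status.json). All values are placeholders.
import requests

BASE_URL = "https://forum.example.com"   # placeholder forum URL
HEADERS = {
    "Api-Key": "YOUR_API_KEY",           # a moderator/admin API key
    "Api-Username": "moderator_bot",     # placeholder acting username
}

def set_topic_closed(topic_id: int, closed: bool = True) -> None:
    """Close a topic (closed=True) or reopen it later (closed=False)."""
    resp = requests.put(
        f"{BASE_URL}/t/{topic_id}/status.json",
        headers=HEADERS,
        data={"status": "closed", "enabled": "true" if closed else "false"},
    )
    resp.raise_for_status()

if __name__ == "__main__":
    set_topic_closed(1234)  # cool things down now; reopen once tempers settle
```

This route only toggles the closed status immediately; for a scheduled reopen, the topic timer in the staff UI is usually the simpler choice.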

Likes: 37

On the forum that I manage, we’ve found that being clear about why we do the things we do makes the community more cooperative. If a moderator is perceived as arbitrary, it causes problems. Of course, that means we’re constantly iterating on our community guidelines, but not so often that we get bogged down in rules.

When we have to impose sanctions, our default mantra is to set blasters to stun. Serious action like thread closure (our people don’t like to be interrupted) or suspension requires more than one pair of eyes before it goes through.

Likes: 11

That. 110%. Over at Stonehearth, any “major” action is discussed by at least two staff members and typically ends up being discussed by all active staff before it takes place. We’ve taken advantage of the updated blocking feature to prevent public posting while we chat quietly with a user. If necessary - and thus far it has been very rare - we’ll go for a suspension. If the offense is clear and indisputable, we’ll block without discussion, but that too is quite rare. The only instances where a single moderator would block without discussion are an inappropriate username and/or profile image. Even when we had a clear spammer posting pornographic videos (which were quickly flagged into oblivion), we didn’t delete the spammer until after a (brief) discussion.

Another focus for us is leading by example. Our staff is active in the community, and many users don’t immediately recognize us as staff, just as friendly users. We try to welcome each new user after their first post with a “hello and welcome” reply and be as friendly as we can. If a user posts in the wrong place, we’ll move it and then post a reply explaining why. 95% of the time, users are appreciative, and this helps foster a friendly and helpful community. As a result, most users will seek us out when they need help, and we aren’t unknown beings watching over the forums.

Likes: 9

As I reflect on this topic, I think the advice here is still sound. However, there is one thing I’d possibly add:

  1. Consider having a site feedback topic summarizing significant moderator decisions and the rationale behind those decisions. We find that transparency in moderation goes a long way toward making communities more sustainable. Over time, you’ll end up with something of an illustrated rulebook… specific examples of the types of behaviors that are not welcome on your site, and the consequences of those behaviors. Related discussions can be spun out into other site feedback topics to improve and enhance your community as well. That’s the goal!

Likes: 21

I also wanted to capture this thought here, because there’s something uniquely dangerous about it, and it’s definitely on the extreme end of the moderator challenge level.


I found this exchange on Hacker News fascinating.

I think the relevant guideline puts it well:

Comments should get more civil and substantive, not less, as a topic gets more divisive.

The inadequacy of this guideline, and the couching of most moderation along its lines, is why the problem and ‘dynamics’, as tptacek puts it, exist in the first place.

The site selects for and breeds civil, substantive racists and misogynists (along with the hyper-sensitized responses) like a hospital breeds antibiotic-resistant superbugs.

I can see ‘selects for’, but ‘breeds’ seems a stretch. Unless you mean breeds civility within racists and misogynists, which seems beneficial?

Yes, mostly the second thing. It’s the opposite of beneficial - because the guidelines say ‘don’t be a meanie/obvious blowhard’ and most people who get called out for anything are called out for something along those lines, bigots who adapt to these can and sometimes do last on the site for years.

HN’s mods put in a great deal of effort and are surprisingly successful at containing the far more basic and common human impulse to be a jerk to strangers online. They have rules, they enforce them, they publicly shame rulebreakers, etc. You are explicitly not allowed to be an asshat on HN and everyone knows it. The place would be better if ‘don’t be a bigot’ got the same treatment. All-caps users and transgressors against HN’s fundamentalist quotation-marks cult are exposed to more public opprobrium than your typical “human biodiversity” sea lion.

I had never thought about it this way, but he’s right – a racist, misogynist, or bigoted line of argument is far more dangerous when it is draped in the robes of overt civility. So to the extent that we are teaching people …

Hey, it’s OK to say racist / sexist / bigoted things, as long as you say them in a civil, substantive manner

… we are indirectly creating much more powerful racists / sexists / bigots, who will become immune to less sophisticated moderators who will only see “well, what this person is saying is kind of morally abhorrent, but they aren’t saying it in a mean way…”

Likes: 13

Studies where they give classes on empathy, “healthy relating”, and non-violent communication to people out on the psychopathy spectrum show that they become more effective… psychopaths.

It’s indeed a challenging aspect of being facilitators and moderators in a community.

Likes: 12

It is worth considering a group-led category for moderation. That way, people can opt in to those discussions while the vast majority of users can opt out.

We created a logged-in-users category (restricted to Trust Level 0 and up) so it is easy to move discussions there and keep them away from web crawlers and other anonymous traffic.
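
For anyone who wants to script that kind of restricted category, here is a rough sketch against the Discourse REST API. The permission parameters are an assumption based on how the category settings form submits (a built-in group name mapped to a permission level, where 1 means create/reply/see), so verify them against your own instance before relying on this.

```python
# Rough sketch: create a category visible only to logged-in users by granting
# access to the built-in trust_level_0 group and omitting "everyone".
# Assumes POST /categories.json accepts a permissions hash; verify on your site.
import requests

BASE_URL = "https://forum.example.com"   # placeholder forum URL
HEADERS = {
    "Api-Key": "YOUR_API_KEY",           # an admin API key
    "Api-Username": "admin_account",     # placeholder admin username
}

def create_logged_in_category(name: str) -> dict:
    """Create a category that anonymous visitors and crawlers cannot see."""
    resp = requests.post(
        f"{BASE_URL}/categories.json",
        headers=HEADERS,
        data={
            "name": name,
            "color": "0088CC",
            "text_color": "FFFFFF",
            # Level 1 = create/reply/see for all logged-in users; leaving out
            # "everyone" keeps the category off the anonymous, crawlable site.
            "permissions[trust_level_0]": 1,
        },
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(create_logged_in_category("Members Only"))
```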

Likes: 4

Here is some compelling data in favor of banning your most troublesome users. Fortunately, such users are rare enough that the effort of banning them is well worth it.

It also underscores the importance, when moderating content, of leaving a signpost explaining why a bad post was removed rather than having it simply disappear.

The two papers are “Community Interaction and Conflict on the Web” (PDF link) and “Does Transparency in Moderation Really Matter? User Behavior After Content Removal Explanations on Reddit” (PDF link).

Likes: 16

This topic is fantastic. I’ll definitely share it with our community managers. :grinning:

Thank you so much for putting it together!

Likes: 7

A detailed explanation from Boing Boing on posting in good faith versus bad faith. Boing Boing runs the longest-operating public Discourse instance in the world. If anyone would know, it’s them!

In the context of moderation, “bad faith” means:

  1. Violating the community guidelines. This one is easy and obvious.
  2. Posting with the sole purpose of inflaming the community. This necessarily requires some way of judging that the user most likely doesn’t believe what they are saying.
  3. A pattern of consistently playing “devil’s advocate” across multiple topics. We have seen several bad actors deliberately take contrarian positions in order to inflame the community.
  4. Repeatedly feigning personal involvement in topics. That is, users pretending to be a member of every minority, to have experienced everything, and to be personally affected by every issue. We have seen bad actors do this deliberately.
  5. Traditional sealioning (entering a discussion, demanding that it unfold entirely on their terms, demanding that people explicitly refute their position, and dismissing everything else).

The moderators point out that all of the negative behaviors above should be answered with the flag :triangular_flag:, not the reply button! A reply is encouragement; it’s like saying “yes, please give me more of this.”

Likes: 17

This article is mostly about the world-eating “everyone on one website” :globe_showing_europe_africa: platforms, which Discourse is inherently not… but it does a brilliant job of illustrating the general dangers of content moderation by starting from the (of course mistaken) premise of “allow all speech!”

Likes: 10

Wonderful! An excellent, eye-opening article on social media and its many problems. Thank you. :+1:

Likes: 6