Political (and other contentious topics) moderation strategies

Great discussion point.

Yes it’s supposed to, but as you point out:

I found this talk quite shocking:

And I agree with you that US and UK media these days are so busy pushing a narrative and an opinion that the actual news gets twisted, and as you say, this feeds into the discussions people have on social media.


The problem with this approach (and indeed with the “fair and balanced” aspect of journalism) is the assumption of “good faith” on all sides, or the idea that “all opinions deserve to be heard.” Unfortunately, recent history has proved this not to be true.

Bigotry, racism, classism and other terrible underbellies of human nature proliferate this way, in part because many communities permit “sealioning”, or “forced debate”, to occur. The act of sharing one’s opinion should not force others to debate it. These tactics are used very effectively to derail or shut down conversations. We’ve had disruptive members use these tactics in the past to “take over” discussions with intentionally controversial or inflammatory comments that they know will spark debate.

Taken separately, these comments may seem innocent until you examine the comment histories and realize the poster has been repeatedly taking the most inflammatory positions on topics for just this reason. This has sometimes taken more than a year to prove out, leading to untold derailments and damage to the community, all under the guise of letting all opinions be aired and debated.

I wish I could say this was uncommon, but it isn’t. In fact, we just had a user admit to this process, having created new accounts after each was banned since 2013. An increasing number of folks seem to enjoy joining forums built around certain types of communities just to “stir the pot” and post the reactions elsewhere, or just for personal gratification. It is important that a forum take this behaviour into account if it is to permit controversial or charged topics.


Well stated, and you have actively moderated a site that discusses quite a few contentious topics for quite some time now, so you have plenty of real world examples to draw on. I also found this comic on the Paradox of Tolerance – which I took 30 minutes to source to the original, surprisingly in Spanish by an amazing art infographic account – extremely relevant:

It seems contradictory to extend freedom of speech to extremists who … if successful, ruthlessly suppress the speech of those with whom they disagree.


Sorry for the revival, but I felt I could provide something of value here. I’m not sure whether there are communities still wrestling with this, but the toxic political discourse has gone nowhere, so perhaps this can be of help.

Over at my forum, with a very active userbase, we made the decision two years ago to prohibit any political discussion. And ya know what? After a brief, awkward adjustment, everyone is on board. And thankful for it.

After literal years of hosting an “Off-Topic Tavern,” where anything went, we realized two primary things …

  1. You cannot sequester emotion: whatever conversations occurred in the OTT would degrade the quality of community and discussion elsewhere on the site. With a substantial sample size, users—who otherwise are a tight-knit community around a specific topic (a football team)—were unable to disentangle their thoughts and emotions about other users with whom they had sparred politically. This was backed up by a drop in Active Users, users requesting that their accounts be deleted, and anecdotal comments from dozens of long-time users.

  2. It devoured moderator time: it is simply impossible to retain volunteer moderators (hell, even paid moderators) when you have an active community discussing politics. I had two moderators forgo their mod duties specifically because of how toxic the politics got on the forum. Go ahead and visit the subreddits out there: it’s either an echo chamber or a complete shitshow. Since moderators are also responsible for keeping the peace and furthering the appeal of the community itself, this put them in an impossible spot. Again, forum degradation as a result.

Finally, we’re in weird territory with politics. If you have private messaging enabled on your forum, people can use that feature to send any type of abuse, including threats. And when that happens, you as the forum owner now have legal exposure and a responsibility to act.

My user base is endlessly opinionated—about everything. And when we made the decision, I made it clear that it was my call. Several long-time users departed in protest. Most returned. And at least two have reached out to me personally since to thank me for the decision, stating that the forum could be “an escape from vitriol of today.” It became a community again.

And after all, isn’t that the point in all of this?


Indeed it is! Thank you for sharing your experience :bowing_man:


Politics is indeed a toxic subject. On my forum there were two topics that drifted into politics. It was hard to discuss the topics themselves without members being drawn into differences of opinion, and the threads becoming a firestorm. The member who initiated the political rhetoric stopped visiting altogether… and the forum returned to its prior friendly and cordial state. Politics is no longer discussed. :+1:


I’m glad this approach works for you. It would probably work for Nextdoor-like groups where the primary purpose of the group isn’t politics, but it doesn’t “solve” anything in the macro sense. It doesn’t make Discourse a better product to say “censor all political discourse if you want to run Discourse” (essentially, I know that’s not your intent here).

Our forum is part of a vast and varied community that discusses everything from art and culture to world politics, vulnerable minorities, and social reform. Our blog is attacked often enough that we are part of both Google’s Project Shield and Cloudflare’s Project Galileo in an attempt to stop attacks from the wider internet, and we are making a good-faith effort to discuss the “hard” topics (as @codinghorror described above). Unless the plan really is “if you want to have contentious topics in your community, use another tool”, I think we need to look for solutions that aspire to accomplish more than just removing the safe spaces where these discussions can take place to begin with. Because Nextdoor isn’t one of those places, but I do believe Discourse software is trying to be an option that enables them to exist.


It feels like a large social problem that Discourse could help to solve. The whack-a-mole approach to moderation might be useful for getting rid of posts that violate a site’s guidelines, but it seems unlikely to lead to conversations that reduce polarization or actually solve any of the issues being discussed.

Maybe part of the difference is that people implicitly understand the rules for certain types of conversations. For example, if someone posts a question on Meta, it’s understood that an appropriate reply would likely be one of the following:

  • asking for more information
  • commiserating
  • suggesting a workaround
  • answering the question

I don’t think there’s a similar shared understanding of how to respond to topics about polarizing, emotionally charged issues.

I’m wondering if there would be any value in imposing an explicit structure on these types of conversations. The idea would be to use structures as temporary scaffolding until a community, or the culture as a whole, has caught on to how to have conversations about emotionally charged topics. There are a few possible structures in our current culture:

  • the talking stick: a talking stick is passed around the group. Only the person holding the stick has the right to speak, while all others listen respectfully. Possibly a similar approach could be used with online conversations to emphasize the humanity of the people involved in a particular discussion. (Be mindful of cultural appropriation issues with this though.)

  • formal debate: a resolution is debated between two opposing teams in front of an audience. Discourse seems like an ideal platform for this. If a conversation goes off the rails, a related debate topic could be started. The “audience” could be polled before and after the debate to see if it had any effect on their opinion.

  • steelmanning: the opposite of strawmanning, i.e. presenting the strongest case for the opposition’s argument. A possible implementation would be to require participants in a conversation to make an argument in favour of the opposing point of view before they are allowed to add more posts arguing their own point of view to a topic.
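To make the steelmanning structure above concrete, here is a rough sketch (plain Python, not Discourse code) of a gate that requires a participant to post a steelman of the opposing view before each further argument of their own. All names here (`TopicGate`, `can_post`, `record_post`) are hypothetical.

```python
from collections import defaultdict

class TopicGate:
    """Tracks, per user, whether their next post must be a steelman."""

    def __init__(self):
        # True means the user owes a steelman before arguing again
        self._owes_steelman = defaultdict(bool)

    def can_post(self, user: str, is_steelman: bool) -> bool:
        if self._owes_steelman[user] and not is_steelman:
            return False  # must steelman the other side first
        return True

    def record_post(self, user: str, is_steelman: bool) -> None:
        if not self.can_post(user, is_steelman):
            raise PermissionError(f"{user} must post a steelman first")
        # posting an argument incurs a steelman debt; a steelman clears it
        self._owes_steelman[user] = not is_steelman
```

Under this model a first argument is free, but every subsequent argument requires an intervening steelman post; whether that rule is per-topic or per-thread would be a community choice.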

There may be other structures that could be used to establish points of agreement and disagreement between nominally opposed groups. For example, while members of my local Facebook group disagree about whether more or less policing will help with what’s going on in our downtown, many on either side agree that we’d like there to be fewer people living on the streets and fewer people suffering from drug addiction. A discussion platform could allow us to establish this point of agreement. Further discussion of the issue could be limited to those members of the community who accepted that their nominal opponents agreed with them on a desirable end goal, while they disagreed on the specifics of how to achieve the goal.
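As a rough illustration of that shared-goal idea, here is a minimal sketch (hypothetical Python, not an existing Discourse feature) of a topic that only accepts posts from members who have acknowledged a stated point of agreement:

```python
class AgreementGatedTopic:
    """Further discussion is open only to members who have acknowledged
    a point of agreement with their nominal opponents."""

    def __init__(self, shared_goal: str):
        self.shared_goal = shared_goal
        self._acknowledged = set()

    def acknowledge(self, user: str) -> None:
        # User accepts that the other side also wants the shared goal.
        self._acknowledged.add(user)

    def can_post(self, user: str) -> bool:
        return user in self._acknowledged
```

The interesting design question is what the acknowledgement step should look like in the UI — a checkbox is easy to click through, whereas restating the shared goal in your own words would be a stronger filter.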

If it’s not clear, I’m guessing quite a lot here. It’s possible that I’m completely off base, or that I’m reinventing the wheel from first principles. There are a couple of Discourse sites I’d like to create that risk becoming quite toxic, so any suggestions about how to have useful discussions around polarizing issues would be appreciated.


I like the idea of tools for difficult conversations.

It might be useful if we could have a topic where only two nominated people could contribute. Perhaps either of them should be allowed to pass the baton to someone of their choosing.

Commonly in public speaking we have time limits. In personal conversations we have conventions of turn-taking. (Video calls suffer here, because latency and half-duplex audio get in the way.) One of the things I find very stressful is when someone takes too long, perhaps by repeating themselves, or introduces too many points. Responding to an ever-growing list of points with quoted paragraphs becomes unproductive.

So, a limit on post length, for these kinds of difficult conversations, might be useful. Somehow get people to think about their position and boil it down to something digestible. Which is different from getting everything off your chest.
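As a toy model of the two-speaker, baton-passing, and post-length ideas above (plain Python, nothing like a real Discourse plugin; every name here is made up):

```python
MAX_CHARS = 1000  # assumed per-post cap for "difficult" topics

class TurnTakingTopic:
    """Two nominated speakers alternate posts; either may pass their
    slot (the baton) to someone of their choosing."""

    def __init__(self, first: str, second: str, max_chars: int = MAX_CHARS):
        self.speakers = [first, second]
        self.turn = 0  # index into self.speakers
        self.max_chars = max_chars
        self.posts = []

    def post(self, user: str, text: str) -> None:
        if user != self.speakers[self.turn]:
            raise PermissionError(f"it is {self.speakers[self.turn]}'s turn")
        if len(text) > self.max_chars:
            raise ValueError(f"post exceeds {self.max_chars} characters")
        self.posts.append((user, text))
        self.turn = 1 - self.turn  # hand the floor to the other speaker

    def pass_baton(self, holder: str, successor: str) -> None:
        # either speaker may hand their slot to someone of their choosing
        i = self.speakers.index(holder)
        self.speakers[i] = successor
```

The character cap is a stand-in; in practice a word limit, or a limit on quoted blocks per post, might better match the "boil it down" goal.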

But people do need to feel heard - whether or not they are agreed with. I don’t know if likes, hearts, reactions, are enough for that.

A means of having a facilitator for a conversation might be useful - it might be something like a moderator. Their contributions would need to stand out visually.

There are of course real-life skills, which can be taught, to help with mediation and reconciliation. And de-escalation…


I’ve been wondering if classroom discussions could be used as a model for how to have online discussions about contentious issues. There are a few “active learning” discussion strategies that might be useful: Active learning - Wikipedia. A basic implementation would be for the OP to ask a question instead of just making a contentious statement that people react to. I’ve seen people get good results from this approach on Twitter.

A more sophisticated approach would be to have someone fill the role of teacher/facilitator. They could ask specific participants in the discussion to answer a question. They could also divide discussion participants into collaborative learning groups, or pair participants to implement something like the think-pair-share learning strategy.

They sure do! It may be that giving people sufficient attention in an online discussion will require excluding some people from it. I don’t think having 50+ (guessing at the number here) people posting their thoughts on an issue is satisfying for anyone. Dividing participants into collaborative learning groups could help give meaningful attention to more participants, though. An aspect of attention that’s often overlooked is that people need to both give and get attention. A structure that promotes giving attention while you wait your turn to get attention would be useful.