Here’s an interesting read for community managers and web anthropologists. It’s about the measures taken against toxic players in online games.
As I read it I was constantly reminded of similarities in Discourse’s design.
For one thing, the game has added an honor system in which feedback from the community shows up on a player’s profile.
But more important than just finding the bad behavior is figuring out how to reform it. Lin says that the 1 percent of seriously toxic players have no real interest in changing, but pretty much everyone else will actually stop their negative behavior if you act quickly.
For players who were regularly reported for bad behavior, Lin’s team found that about 50 percent of them didn’t offend again if a Riot moderator explained what they actually did wrong, and that number jumped to 70 percent if the explanation included evidence like chat logs.
Other parallels: authoritative moderator messages, and temporary suspensions favoured over permabanning.
Also reminds me of this: