Measures taken against bad behaviour in League of Legends

Here’s an interesting read for community managers and web anthropologists. It’s about the measures taken against toxic players in online games.

As I read it I was constantly reminded of similarities in Discourse’s design.

For one thing, the game has added an honor system, in which feedback from the community shows up on a player’s profile.

Flags.

But more important than just finding the bad behavior is figuring out how to reform it. Lin says that the 1 percent of seriously toxic players have no real interest in changing, but pretty much everyone else will actually stop their negative behavior if you act quickly.

Just-in-time notifications.

For those players who were regularly reported for bad behavior, Lin’s team found that about 50 percent of them didn’t offend again if a Riot moderator explained what they actually did wrong, and that number jumped to 70 percent if the explanation included evidence like chat logs.

Authoritative moderator messages, and temporary suspensions rather than permabans.

Also reminds me of this:

18 Likes

Yeah they’ve experimented a lot with this stuff:

Lead producer Travis George was fearful the game’s reputation was hurting its appeal and stifling growth. “Nobody wants to play a game with somebody who’s mean,” George was quoted as saying. To change player behavior, Riot Games formed a working committee George called “the PB&J Team, which stands for Player Behavior and Justice Team.” The team included two PhDs, one a cognitive neuroscientist and the other a behavioral psychologist. The group employed concepts similar to Bandura’s and built a variable social reward mechanism into the game.

The system was designed to curtail bad behavior by allowing players to report users for “unsportsmanlike” conduct. Offending players were judged by their peers in a “Tribunal,” which company President Marc Merrill is quoted as saying helps players “recognize that there are consequences to their actions.” Referrals to the Tribunal are made public, and the company emphasizes that “offenders lose games!”, making them social outcasts others avoid playing with.

To reinforce teamwork and helpfulness, the company also instituted a rewards system called “Honor” to “identify … pillars of the community.” Honor points became viewable to other players and were awarded or revoked by members of the player’s own or the opposing team. Since its implementation in late 2012, Riot Games has reported a dramatic reduction in adverse behavior and a noticeable increase in helpful actions.

I have a hard time keeping track of what they’ve tried most recently, and a few of the things they tried were dropped because they didn’t work well. Of course, experimentation is key… and it’s not just about discouraging bad behavior, but about encouraging positive behavior alongside it.

These are the two most recent ones, I think:

3 Likes

That reminds me of an insightful article from @codinghorror’s blog.

Here’s the tidbit that speaks to me the most (emphasis mine):

Thank you so much for posting this, @erlend_sh. It would be interesting to implement some form of these concepts, or something similar that fits my community’s social framework (and the unique conflicts within it).

4 Likes

An update, with an emphasis on speed of feedback:

http://www.nature.com/news/can-a-video-game-company-tame-toxic-behaviour-1.19647

“If you look at any classic literature on reinforcement learning, the timing of feedback is super critical,” says Lin. So he and his team used the copious data they were collecting to train a computer to do the work much more quickly. “We let loose machine learning,” Lin says. The automated system could provide nearly instantaneous feedback, and when abuse reports arrived within 5–10 minutes of an offence, the reform rate climbed to 92%. Since that system was switched on, Lin says, verbal toxicity in so-called ranked games, which are the most competitive (and most vitriolic), has dropped by 40%. Globally, he says, the occurrence of hate speech, sexism, racism, death threats and other types of extreme abuse is down to 2% of all games.
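Out of curiosity, here’s a minimal sketch of what that kind of fast feedback loop could look like. To be clear, it’s a toy illustration: the classifier, thresholds, and function names are all assumptions on my part, not Riot’s actual pipeline.

```python
import time
from dataclasses import dataclass


@dataclass
class AbuseReport:
    player_id: str
    chat_log: list[str]
    reported_at: float  # Unix timestamp of the reported offence


def classify_toxicity(chat_log: list[str]) -> float:
    """Hypothetical stand-in for a trained classifier.

    Returns a toxicity score in [0, 1]. A real system would use a model
    trained on the labelled reports the article mentions, not a word list.
    """
    flagged_terms = {"uninstall", "trash", "feeder"}
    hits = sum(any(t in line.lower() for t in flagged_terms) for line in chat_log)
    return min(1.0, hits / max(len(chat_log), 1))


def send_feedback(player_id: str, evidence: list[str], delay_min: float) -> None:
    # Placeholder delivery channel; the real system messages the game client.
    print(f"[{delay_min:.1f} min after offence] {player_id}: your chat was "
          f"flagged by other players. Evidence: {evidence}")


def handle_report(report: AbuseReport, threshold: float = 0.5) -> None:
    """Score a report and send feedback immediately if it crosses the bar.

    The whole point, per the article, is latency: reform rates climbed to
    92% when feedback arrived within 5-10 minutes of the offence.
    """
    if classify_toxicity(report.chat_log) >= threshold:
        delay_min = (time.time() - report.reported_at) / 60
        send_feedback(report.player_id, report.chat_log, delay_min)


# Example: a report filed five minutes ago crosses the threshold.
handle_report(AbuseReport("summoner42", ["uninstall pls", "gg"], time.time() - 300))
```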

6 Likes

Sounds like the latest is an honor system?

Everyone starts the new system with level 2 honor, so if you see someone below that, it means they’ve already dropped below the baseline.

“The big change is that each player can now only honor one teammate; a shift that helps make each honor feel more weighty on both the giving and receiving ends”

Key fragments now ONLY drop through honor. Keys are part of an in-game crafting system, and you use them to open chests. Your honor level dictates your key drop rate: from level 2 upwards it’s the normal rate, at level 1 it’s a lot worse, and at zero (dishonored) it’s nothing.

Basically incentivizing niceness by making “be nice” part of the reward system in the game?
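Spelled out in code, that coupling is strikingly simple. A rough sketch, with the caveat that the multipliers below are invented; the description above only gives the qualitative tiers “normal rate”, “a lot worse”, and “nothing”:

```python
# Hypothetical multipliers: the thread only gives qualitative tiers
# (normal / a lot worse / nothing), so the exact numbers are made up.
KEY_DROP_MULTIPLIER = {
    0: 0.0,   # dishonored: no key fragments at all
    1: 0.25,  # below baseline: a lot worse than normal
    2: 1.0,   # baseline and above: the normal rate
}


def key_drop_rate(honor_level: int, base_rate: float = 0.1) -> float:
    """Effective key-fragment drop chance for a player's honor level."""
    # Levels above 2 keep the normal rate, so clamp down to the top tier.
    return base_rate * KEY_DROP_MULTIPLIER[min(honor_level, 2)]


# Example: honor levels 0-5 map to rates 0.0, 0.025, 0.1, 0.1, 0.1, 0.1.
print([key_drop_rate(level) for level in range(6)])
```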

4 Likes

Now we just need to apply this system to real life, where we can rate every live interaction we have with another person on a five-star scale…

On a more serious note, Blizzard is experimenting with something similar for Overwatch:

Mercer said that the team has “a multitude of initiatives internally” that they’re working towards as ways to address toxic behavior. He cited a pilot program that emails players when their reports have led to action against offending players, and mentioned plans to integrate that into the game client itself.
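That reporter-facing loop could be as small as the sketch below. The function names and message text are my own guesses, not Blizzard’s actual system:

```python
def send_email(to: str, subject: str, body: str) -> None:
    """Stand-in delivery layer; a real system would go through a mail service."""
    print(f"To: {to}\nSubject: {subject}\n\n{body}")


def notify_reporter(reporter_email: str, offender_handle: str, action: str) -> None:
    """Close the loop with the reporter once their report leads to action.

    The follow-up step Mercer mentions would surface the same message in
    the game client instead of email.
    """
    send_email(
        to=reporter_email,
        subject="Your report led to action",
        body=(f"Thanks for your report. Action has been taken against "
              f"{offender_handle}: {action}."),
    )


notify_reporter("player@example.com", "RudeTeammate#1234", "temporary suspension")
```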

6 Likes

More interesting developments here:

And “keep your friends close but your enemies closer” when it comes to cheaters and hackers?

1 Like