Tech Companies Want to Tackle Harassment in Gaming

Competitive Counter-Strike: Global Offensive player Adam Bahriz will probably kill you in-game. He’s so skilled that he landed a contract with Team Envy, an esports organization that’s home to some of North America’s highest-ranking competitive esports players. Bahriz also just happens to be deaf and legally blind, with a condition known as HSAN 8.

“What do you guys want to do? Just bust out A? I can buy smoke,” Bahriz says. His teammates immediately jump in to mock him and shut him down. “You’re just gonna get blocked,” one of them says. “We know you’re trolling,” another says. “So annoying.” “You’re muted already.”

“OK, I won’t talk, sorry,” he says, resignedly.

Bahriz spends the rest of the game in silence and even starts crying, revealing the very real, potent effects that bullying has on gamers who experience it. It’s everything that’s wrong with toxic gaming culture, where insults are thrown freely, bullying happens regularly, and everything from racism, misogyny, homophobia, transphobia, ableism, and more is fair game. “This incident made me feel super depressed,” Bahriz tells me. “I simply want to have a fun time playing a game—but a speech impediment that is beyond my control makes it difficult.” Bahriz says the toxic teammates eventually kicked him from the game, and although “most of the time people are toxic, it is rare to actually be kicked from the game. That’s why it was so upsetting. You can mute toxic people, but you cannot prevent your whole team from ganging up to kick you for no reason other than a speech issue.”

In 2017, a Twitch streamer, Nicole Smith, recorded the verbal abuse she received while playing Overwatch.

“Go back to the kitchen,” one teammate said. 
“This is the reason why girls should not do anything,” another chimed in.
“Can you actually go and die?”

Much like Bahriz, Smith was met with a barrage of insults, harassment, and, in her case, misogynistic comments. The abuse that Smith had to endure just to play video games is reminiscent of GamerGate, where women in gaming and games journalism (as well as anyone who spoke up to defend them) endured weeks, months, and in some cases years of harassment, including death threats, doxing, and stalking. This led to changes in the game industry’s response to online harassment: some game developers and publishers rolled out their own initiatives to combat in-game toxicity, while many of those same publishers and developers drew widespread criticism for waiting until people’s lives were in danger to take harassment seriously.

A 2020 Anti-Defamation League survey revealed that 81 percent of American adults experienced harassment in online multiplayer games, compared to 74 percent in 2019, while 70 percent were called offensive names in online multiplayer games, and 60 percent were targets of trolling or “deliberate and malicious attempts to provoke [other gamers] to react negatively.” Overall, harassment rose 7 percentage points from 2019 to 2020.

Bahriz no longer receives as much abuse as he used to, but when he does, he usually just mutes the offending players and tries his best “to not let the toxicity distract mentally from the game,” he says. For others, however, simply muting doesn’t work, if it’s even available in the game they’re playing. In 2019, another ADL survey found that 22 percent of American adults who were harassed in online multiplayer games stopped playing certain games altogether because of the harassment.

Game Developers Want to Fight Back but on Their Terms

In 2017, Activision Blizzard, Epic, Intel, Microsoft, Twitch, and over 200 other companies formed the Fair Play Alliance to, as its website says, “encourage fair play and healthy communities.” In 2018, Blizzard publicly named 180 Overwatch players banned for toxic behavior, including being abusive in audio chats and deliberately throwing games. Not bad for a game that didn’t even have the option to report abusive players upon its 2016 release. In 2019, Ubisoft issued instant half-hour bans for Rainbow Six Siege players if the company detected slurs in the text chat. Ubisoft’s code of conduct says this includes “any language or content deemed illegal, dangerous, threatening, abusive, obscene, vulgar, defamatory, hateful, racist, sexist, ethically offensive or constituting harassment.” Also that year, Electronic Arts established a Players Council with an inaugural summit at Gamescom in Cologne, Germany.

Riot Games, a company that’s been in the news for toxicity both internally and in its games, is also working to address the issue. In 2012, it introduced the Tribunal System in League of Legends, in which players received temporary bans based on actions and offenses that other players deemed unacceptable. (The Tribunal System no longer exists.) In 2016, it published a report in Scientific American concluding that, based on its study of toxicity, adding in-game tips (among other measures) decreased in-game toxicity by 25 percent, both in players being abusive in lobbies and in matches containing abuse. As recently as April 2021, Riot changed its privacy policy to allow for the capture and evaluation of a player’s voice communications when a report has been submitted about their behavior, with the goal of cutting down on toxicity in voice comms as well as in-game chat.