Dahlia Penna – Red Monkey Collective – Is censorship the only way to stop toxicity?

06 September 2017


Toxicity in online multiplayer video games remains a big problem. There's no shortage of stories about streamers being harassed, and there are very few people who have played online games without witnessing, or even taking part in, the flaming of teammates or opponents.

There's a fine line between jesting and being insufferably vile in chat. The veil of anonymity that comes with hiding behind a Twitch username or a gamertag can turn gamers into horrible specimens. But is there a feasible solution? And is censorship the only one that works?

Red Monkey Collective, a talent and brand management agency, has contributed two guest pieces on the issue of censorship in gaming and esports.

The first of the two-part series comes from Dahlia Penna, Esports Associate and, more importantly, D.Va main. She talks extensively about her experience with toxicity as a female gamer, as well as potential solutions.

The second part of the series comes from Adam Whyte, Head of Esports, and will follow shortly. Please note, a full bibliography for attributed sources will be included at the end of the second piece.



Team games that involve mics can be daunting places for those who delight in the fact that gaming usually doesn't involve 'real', personal interaction. Behind a screen, you can be anyone or anything, and no one would be the wiser (Ready Player One, anyone?).

Psychologist John Suler refers to this as the online disinhibition effect: the way that the anonymity of cyberspace frees us to say things we wouldn’t dream of in real life – ‘irl’ (Holfeld & Sukhawathanakul, 2017).

“Typically, the first thing I hear is, ‘Are you a girl or a 12-year-old boy?’ Having received a fair amount of the gender-based abuse that follows, I now typically opt for the latter.”

Unfortunately, the ability 'to be anyone' can often mean being an asshole.

In gaming and esports, cooperation with your teammates (“Tracer on point!”, “Rush B!”, etc.) is integral to a team's success (Lin, Sun & Tinn, 2003). It's hard to get away with being mute on mic, especially higher up the rankings – communication is key to winning gameplay.

I nearly wrote that people can immediately tell I'm a girl just from my voice, but actually, that isn't the case. Typically, the first thing I hear is, ‘Are you a girl or a 12-year-old boy?’ Having received a fair amount of the gender-based abuse that follows, I now typically opt for the latter.

Now, 90% of the time, people are lovely, especially in Overwatch, my go-to title at the moment. By lovely, I mean that they don’t mention the fact that I’m not a guy. However, there is a small but violent and vociferous minority that can really ruin the game for women and other online minorities.

Comments range from ‘go and give your boyfriend back his account’ to ‘clever and unique’ sexist remarks to tragic idiocies that I won’t mention here. Yes, I’ve even been asked to make a sandwich – originality at its finest.

However personally hurtful these remarks can be at times (and they do hurt), misogynistic comments are a far cry from the worst things I have heard people say while gaming. Racist and xenophobic comments are strewn throughout all communities and are far worse than the odd 'get back in the kitchen' remark. Toxicity exists in all online communities, and there is no shortage of reasoning as to why.

The mixture of anonymity and 'Electronic Screen Syndrome (ESS)' (Dunckley, 2015) can result in behaviour that people wouldn't dream of in the real world. Competitive gaming communities are no exception, from MOBAs to FPSes and everything in between. I have British CS:GO player friends who claim they can 'speak Russian'.

By this, they mean they can swear in Russian – some even going out of their way to learn Russian swear words simply so they can abuse their Eastern comrades when 'necessary', or in retaliation. There is little developers such as Blizzard can do to halt this torrent of hateful and cruel abuse over the mic – but there are usually block and report buttons close at hand.

“Comments range from ‘go and give your boyfriend back his account’ to ‘clever and unique’ sexist remarks to tragic idiocies that I won’t mention here. Yes, I’ve even been asked to make a sandwich – originality at its finest.”

But are these protective tools enough? Even once you've blocked someone, there is a chance you could still be queued into a game with them. Moreover, the widespread feeling in the community is that report requests rarely lead to any action.

The paramount question that runs throughout this article is whether game publishers have a duty to censor swear words, racist buzzwords and general xenophobia in text chat. Do they need to adopt a maternalistic approach to the community and ensure that its members are protected from 'harm'?

Should people even be reported/banned at all for what they say over mic or via chat? There is the fear that I’m sure most competitive gamers have had flit through their mind at least once: “have I taken it too far? Will they report me? Will I get a permaban?!”

Certain people feel that absolute censorship is necessary in the space. One frequent social media user stated:

“Online gaming will be a LOT better when companies actually start banning people for yelling the n-word over and over again in chat.” 

Obviously, it’s not nice to hear racist or sexist insults. However, should a game publisher actively monitor their chat channels? Who is the arbiter of truth? Who decides what is moral, offensive, etc.? It seems to me that online trolls deliberately speak this way so they’ll get the sort of air-time mentioned above. Moreover, I’m no lawyer, but this doesn’t seem like a legally sound, proportionate, or measured response. Especially when you can mute your mic.

Personally, people have threatened to report me for many reasons – from accusations of aimbotting (I'll take that as a compliment) to general gameplay decisions (I'll take that as an insult). Blizzard in particular set a good example of showing players when they should and shouldn't report after overhauling the report system in June this year.

We've seen clever and limited censorship in Blizzard's automatic replacement of bad-sportsmanship messages like “gg ez” with a range of comedic and self-deprecating phrases such as: “It's past my bedtime. Please don't tell my mommy” and “Gee whiz! That was fun. Good playing!”
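As a rough illustration of how that kind of substitution could work, here is a minimal sketch – not Blizzard's actual implementation, and the matching rule and phrase list are assumptions made purely for the example (two of the phrases are the ones quoted above):

```python
import random
import re

# Hypothetical replacement phrases, in the spirit of Blizzard's "gg ez" filter.
# The real list and matching rules are Blizzard's own; these are stand-ins.
REPLACEMENTS = [
    "It's past my bedtime. Please don't tell my mommy.",
    "Gee whiz! That was fun. Good playing!",
]

# A simple pattern for the bad-sportsmanship message being replaced.
GG_EZ = re.compile(r"^\s*gg\s*ez\s*$", re.IGNORECASE)

def filter_chat_message(message: str) -> str:
    """Return the message unchanged, unless it matches the 'gg ez' pattern,
    in which case swap in a random self-deprecating phrase."""
    if GG_EZ.match(message):
        return random.choice(REPLACEMENTS)
    return message

# A gloating message is quietly rewritten before anyone else sees it.
print(filter_chat_message("GG EZ"))   # prints one of the replacement phrases
print(filter_chat_message("gg wp"))   # passes through untouched
```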

I don't believe that this form of censorship disproportionately impedes anyone's right to freedom of speech. Hearthstone, meanwhile, doesn't allow text chat in game at all (one of its most pleasant features) and instead lets players interact via emotes.

Should there be a consequence for people who are rude? I think so. However, there is a difference between dictator-like censorship and control of a community. Personally, I feel that game developers do have a responsibility to control the truly toxic members of the community. Steven Spielberg put it best: “There is a fine line between censorship and good taste and moral responsibility.”

League of Legends developer Riot Games has impressed with its attempts to distinguish frustrated grumbles and good-natured trash talk from ban-worthy vitriol.

“If ‘censorship’ by way of reporting and banning is the only way to stop this… well I’m in.”

In 2011, chat logs from thousands of games each day were recorded as 'positive, neutral or negative'. Even more interesting was the following year, when a 'priming' method (the idea that imagery or messages presented just before an activity can nudge behaviour in one direction or another) was implemented: a warning that harassment leads to poor performance reduced negative attitudes by 8.3%, verbal abuse by 6.2% and offensive language by 11% compared with controls (Maher, 2016).

As such, there are alternatives to complete censorship and to banning in general. I would like to see this sort of 'positive' reinforcement implemented in other games, not least to see whether it has an effect on titles outside of the fantasy genre.

After I'd begun writing this article, very aptly, Blizzard announced a second overhaul to its report system on PC. Though the report system itself will remain the same, the seriousness with which Blizzard treats each report will allegedly rise considerably. An additional feature that Blizzard is said to be implementing soon is a notification system that 'alerts you when a player you reported [is] actioned'.

But the overhaul will not only benefit those who have been victims of other people's abuse and toxicity: people who abuse the report button will now be penalised far more seriously than before. This is surely a good thing, because there are plenty of people in the community who report in unnecessary circumstances, and hopefully this will lessen the frequency of those cases. It's certainly positive to see big game developers listening to suggestions from their fans and continually revising their games accordingly.

Though I believe I have a thick enough skin to combat most things (the block button helps too), I know that many do not. A lot of people use video games as an escape from the struggles of being 'irl', with many gamers forging strong social connections with online friends (Calleja, 2010).

It would be a sad day if some of the more sensitive players were pushed to leave the community simply because toxic players weren’t taught a lesson.

Actions do have consequences, and there is a strong misconception that just because you are behind a screen, it makes it ok for you to say things to people that you wouldn’t to their face. If ‘censorship’ by way of reporting and banning is the only way to stop this… well I’m in.