Content Moderation And Online Gaming

Ashley Mullins
Nov 16, 2020

Authors: Madison Neely, Ashley Mullins, and Alex Koonce

Our group, the Toxic Task Force, focuses on the realm of online gaming and how these free spaces have been filled with toxic attitudes and troubling behavior. Voice and text chat features are riddled with hate speech, sexist language, and other abusive talk, and in some cases doxxing (publicly exposing someone's private information) or swatting (calling armed police to someone's home under false pretenses) has made online gaming culture even more dangerous. In one case, Ohio gamer Casey Viner, angry over a Call of Duty match, helped arrange a false emergency report against an opponent; police responding to the hoax shot and killed Andrew Finch, an uninvolved man living at the address they were given.

Some may argue that individuals simply need to be better players so they do not get "flamed," or that they should not be involved in online gaming if they cannot take the heat. In response, we argue that the way we treat one another in video games reflects how we treat others in reality. When gamers become apathetic to harassment, that tolerance can carry the harassment into real life, and neither place, time, nor circumstance should determine whether someone deserves to be free from harassment. Another issue is that when new games launch, most developers do not make the effort to prevent harassment in the chatrooms, as Cecilia D'Anastasio argues in Games Don't Do Enough to Combat Toxicity at Launch. With this project, we seek to build more conversation around these issues, understand the interventions that could be implemented, and examine how content moderation shapes these problematic behaviors.

Identities and Gaming

In regard to identities, we analyze how gender and race shape users' experiences with online gaming chat features. First, gender: the article Deep Strike: Playing Gender in the World of Overwatch and the Case of Geguri touches on the hardships of being female in professional online gaming. Being female or non-binary in online gaming culture can invite intense harassment in a sport dominated by white men, and unlike their male counterparts, women's achievements have often been ignored, as discussed in Balancing Gender Identity and Gamer Identity and in a reading by Tanner Higgin. In regard to race, many men of color face relentless hate speech in online gaming and cope by becoming desensitized to it. This is what Stephanie M. Ortiz argues in You Can Say I Got Desensitized to It: these men are met with so much harassment that they build an emotional callus. Men of color should not have to become desensitized to harassment; more needs to be done to disrupt this cycle. Moreover, part of the problem with toxic speech on gaming platforms lies in the community guidelines themselves, which is exactly what Andrew Zolides suggests in Gender Moderation and Moderating Gender. Guidelines could do far more to prevent toxic behavior; instead, Zolides argues, the community guidelines of Twitch and other streaming services work to reinforce misogynistic culture.

Content Moderation

Finally, content moderation: the use of volunteer or algorithmic moderators who oversee conversations to make sure they meet community guidelines. Our focus here is on how content moderation shapes voice and text chat in online gaming, and on the forms moderation can take and the improvements that have been made. The article Online Harassment and Content Moderation: The Case of Blocklists explains blocklists on Twitter as a form of content moderation and how this feature helps prevent further harassment. Gaming Algorithmic Hate-Speech Detection by Jesse Haapoja describes an algorithm, built using forums, discussions, and interviews, that detects hate speech. Toxic speech is not a new issue, and game developers have attempted to address it, so why do toxic speech and behavior still persist?
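To make these two approaches concrete, here is a minimal sketch in Python of how a subscribable user blocklist and a term-based message screen might sit in front of a chat channel. It is purely illustrative: Twitter's blocklists and the detector Haapoja studied are far more sophisticated, and every name and term below is our own assumption.

```python
# Toy sketch of two moderation layers discussed above: a shared user
# blocklist (who may speak to you) and a term screen standing in for an
# algorithmic hate-speech detector (what gets through). Illustrative only.

from dataclasses import dataclass, field

@dataclass
class ChatModerator:
    blocked_users: set[str] = field(default_factory=set)  # user-level blocklist
    flagged_terms: set[str] = field(default_factory=set)  # stand-in for a trained model

    def subscribe_blocklist(self, shared_list: set[str]) -> None:
        # Subscribing merges a community-curated list into your own,
        # the core idea behind shared Twitter blocklists.
        self.blocked_users |= shared_list

    def allow(self, sender: str, message: str) -> bool:
        """Return True if the message should be shown to the user."""
        if sender in self.blocked_users:
            return False  # drop messages from blocked users outright
        words = {w.strip(".,!?").lower() for w in message.split()}
        return not (words & self.flagged_terms)  # hide on any screened term

mod = ChatModerator(flagged_terms={"slur1", "slur2"})  # placeholder terms
mod.subscribe_blocklist({"known_harasser42"})          # hypothetical shared list
print(mod.allow("friendly_player", "gg well played"))  # True
print(mod.allow("known_harasser42", "gg"))             # False
```

Real systems replace the flagged-term set with a trained classifier and add human review and appeals; the point here is only that user-level blocking and message-level screening are separate layers that can be combined.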

PS4 Interventions

On October 14, 2020, PlayStation released its 8.00 update, which provides more content moderation services and parental controls. With the update, PlayStation allows players to record voice chats and submit them to PlayStation for moderation. However, the recording feature is only available to PS5 players, meaning that a PS5 player can record a voice chat with those playing on a PS4, but PS4 users cannot record the chat themselves. The update also adds a new "mute all mics" feature within the Quick Menu, though the company's blog post did not make clear whether this mutes all the mics used by one individual player or all the mics of everyone in an online party. Parental controls were also updated, combining "communicating with other players" and "viewing content created by other players" into one simplified control. This feature also lets kids request access for one specific game; the request is sent to the parent via email, and the parent can accept it if they wish so the child can use those features.

One aspect PlayStation left out of its discussion is how the voice recording moderation will actually be processed. The company says recordings will be sent in to be moderated, but does not explicitly state how these reports will be investigated, who will investigate them, or what penalties those who violate community standards might face. PlayStation's move toward recording voice chats is a step in the right direction. However, because it is limited by which console players own, and because the investigation process is not detailed, it is hard to say whether it will be effective in stopping the spread of harassment and hate speech in online communities.

Safe Spaces and Harassment Help for Gamers

Created in 2020, the Games and Online Harassment Hotline works to provide a safe space for those who have experienced harassment online to talk about their experiences and how they have been affected. The group worked with established crisis hotlines to build the service. Those who have experienced harassment can text the hotline and receive quick responses from its organizers, whether to talk through their experiences or to request referrals to other services, mental health or legal. The hotline's website (linked below) provides the number, a backstory on why the hotline was established, and an extensive list of outside resources that harassment victims can seek out on their own or through referral from hotline administrators. The hotline is available Monday through Friday from 4 p.m. to 7 p.m. (PST). While it is beneficial to have this hotline in place, its hours are limited and it is not well advertised within the community. Because it operates primarily during the week, it is less accessible to younger players who can only play on weekends, when their parents allow it, and to working adults who can only game after hours. In our project we seek to build a community and raise awareness of this issue within the gaming community as well as with large companies. We seek to create a place where our voices and others' can be heard, and our project should serve as a way for companies to understand the voices that need and desire change within online gaming platforms.

Guardians MH is a 501(c)(3) non-profit promoting mental health awareness and resources throughout the gaming community. The organization provides gamers with resources such as mental health kits, mentoring and peer support programs, and self-care plans. It has also created a Discord bot, RTS Bot, that offers mental health resources such as a locator for mental health professionals, mental health screenings, and crisis intervention. An ambassador program lets the organization spread the word by sponsoring gamers to promote mental health awareness and refer other gamers to the service.
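Guardians MH has not published how RTS Bot is built, so as a purely illustrative aside, here is a minimal sketch of a resource-lookup command using the discord.py library. The command name, resource entries, and token handling are our own assumptions, not RTS Bot's actual design.

```python
# Illustrative sketch only: a tiny Discord bot with one "!resources" command,
# loosely in the spirit of a mental-health resource bot like RTS Bot. RTS
# Bot's real commands and implementation are not public; everything below is
# a hypothetical stand-in.

import os

import discord
from discord.ext import commands

intents = discord.Intents.default()
intents.message_content = True  # needed so the bot can read command messages

bot = commands.Bot(command_prefix="!", intents=intents)

# Hypothetical resource directory; a real bot would curate and maintain this.
RESOURCES = {
    "hotline": "Games and Online Harassment Hotline: https://gameshotline.org/",
    "screening": "Mental health screenings: (a real bot would link these here)",
}

@bot.command(name="resources")
async def resources(ctx: commands.Context, topic: str = "hotline"):
    """Reply with a resource for the requested topic, or list the topics."""
    await ctx.send(RESOURCES.get(topic, "Available topics: " + ", ".join(RESOURCES)))

if __name__ == "__main__":
    bot.run(os.environ["DISCORD_TOKEN"])  # bot token supplied via environment
```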

Company Interventions

In mid-2019, years after GamerGate first drew mass attention to harassment in gaming, Microsoft updated its community guidelines and promised that new content moderation technology would be released in 2020. However, no further information on changes to content moderation has been released. The May 2019 post from Phil Spencer promised more safety measures would be put in place in the coming year to protect players. Now, nearly a year and a half later, there is no information on what these safety measures would be, an odd occurrence considering Microsoft said it would work alongside Sony and PlayStation to roll out similar measures. As we saw above, PlayStation has just released its plans for further content moderation. Spencer's post gave no details on whether Microsoft was developing new AI technology to filter content, just an update to community standards. Given the lack of follow-through and information on Xbox's part, it can be assumed that these solutions are either non-existent or far in the future.

Spirit AI's Ally is a new AI tool for moderating online platforms. The website offers some information about the processes the AI works through; for instance, "AI-based automation identifies cause of community disruption, not just the individual words." However, the site lacks information on a variety of points. First, it does not explain how the AI actually works or detects toxic behavior. It implies uses in gaming, for example, "Ally allows moderators to evaluate behaviour at the conversation and player levels," but it does not explicitly state whether the AI is made for gaming or for other platforms. The site's photos and layout lean toward a business audience, with photos of men sitting around a conference table on their laptops, which makes the intended use of the AI even more ambiguous. There are also no examples of where this AI has been deployed, if it has been used by game developers at all.
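Spirit AI does not publish how Ally works, but the quoted distinction, evaluating behaviour at the conversation level rather than flagging individual words, can be illustrated with a toy sketch. The scorer, window size, and threshold below are all our own assumptions, not Ally's design.

```python
# Toy illustration of conversation-level moderation: score each message and
# escalate a player only when their recent behavior, not any single word,
# crosses a threshold. Our own sketch; NOT how Spirit AI's Ally works.

from collections import defaultdict, deque

def toxicity_score(message: str) -> float:
    """Stand-in scorer; a real system would use a trained classifier."""
    flagged = {"trash", "loser"}  # hypothetical mild examples
    words = message.lower().split()
    return sum(w.strip(".,!?") in flagged for w in words) / max(len(words), 1)

class ConversationMonitor:
    def __init__(self, window: int = 10, threshold: float = 0.2):
        self.window = window        # number of recent messages considered
        self.threshold = threshold  # average score that triggers human review
        self.history = defaultdict(lambda: deque(maxlen=window))

    def observe(self, player: str, message: str) -> bool:
        """Record a message; return True if the player should be reviewed."""
        scores = self.history[player]
        scores.append(toxicity_score(message))
        return sum(scores) / len(scores) >= self.threshold

monitor = ConversationMonitor()
for msg in ["gg", "you are trash", "trash loser", "uninstall, loser!"]:
    if monitor.observe("player1", msg):
        print(f"escalate player1 for human review after: {msg!r}")
```

Even in this toy, a single heated word never triggers review on its own; only a pattern across a player's recent messages does, which is the design choice the Ally quote hints at.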

What’s our Projects Focus?

While our project shares similarities with some of these examples, we lay out a distinctly different concept. Rather than focusing solely on mental health, harassment, or content moderation, we bring all of these ideas together. From Reddit, it is clear that players have a lot to say about content moderation, and from statistics we know that harassment is a widespread problem that often leads to mental health concerns. Our Discord server, which is not yet operational, provides a safe space for discussing content moderation and for sharing personal experiences with harassment and hate speech. We do this with two separate channels (a setup sketch follows below). The first is the content moderation channel, where we have listed questions (pinned to the channel) that direct the conversation, and where we can step in when needed to guide discussion further. The second is the personal experiences channel, which will have guidelines to keep the safe space operational; it focuses on sharing personal experiences with content moderation or harassment and encourages peer support among those who understand one another's concerns and experiences.

Our podcast seeks to bring awareness to the situation, discussing, through our research and personal stories, what we believe may be the most effective forms of content moderation and our views on the moderation currently in place. We also discuss the issue of children in gaming and how game ratings can conflict with what kids actually experience in game. The website seeks to put all these resources in one space and give a definitive view of our own research on the topic.
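For the technically curious, here is a minimal sketch of how the two-channel setup described above could be scripted with the discord.py library. The channel names and starter questions mirror our plan; everything else (the one-shot client, the token handling) is a hypothetical illustration, not our final implementation.

```python
# Hypothetical one-shot setup script for the two-channel structure described
# above, using discord.py. Channel names and questions are illustrative.

import os

import discord

STARTER_QUESTIONS = [
    "What forms of content moderation have you seen work (or fail) in games?",
    "Should voice chat be moderated the same way as text chat?",
]

class SetupClient(discord.Client):
    async def on_ready(self):
        guild = self.guilds[0]  # assumes the bot has joined exactly one server

        # Channel 1: guided discussion of content moderation.
        mod_channel = await guild.create_text_channel("content-moderation")
        for question in STARTER_QUESTIONS:
            message = await mod_channel.send(question)
            await message.pin()  # pinned questions steer the conversation

        # Channel 2: personal experiences, with ground rules pinned on top.
        exp_channel = await guild.create_text_channel("personal-experiences")
        rules = await exp_channel.send(
            "This is a safe space: no naming targets, no harassment, peer support only."
        )
        await rules.pin()

        await self.close()  # one-shot script: create the channels, then disconnect

client = SetupClient(intents=discord.Intents.default())
client.run(os.environ["DISCORD_TOKEN"])  # bot token supplied via environment
```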

We toyed with a lot of different ideas but realized that this problem might be bigger than a few isolated interventions. That realization shifted our energy toward building a space that encourages resolution of the problem rather than trying to fix it ourselves. As with most problems, solving it begins with starting the conversation and raising awareness.

Guiding Questions

How did you choose the problem space you wanted to work in?

We all realized we were part of the same community and had noticed the same problems. The gaming community is so large and diverse, yet it is still not as inclusive as it could be. Just like any other community, its strongest members are the ones who strive for its betterment.

How did you come up with the idea?

Ji Su Yoo is a student at UC Berkeley's School of Information and a co-founder of Platform Abuse. This project branched from her guest lecture, in which she discussed her research on content moderation within online gaming communities; that lecture is where most of our inspiration came from.

Why did you create what you did?

We created what we did because nothing like it already exists within our community. While gaming blogs cover stories on the issue, there is no centralized website for resources surrounding toxicity in gaming.

How did you create it?

We started by discussing our firsthand experiences, then moved on to situations we or our friends had witnessed. From there we did extensive research on the problem. Once we understood the problem itself, we compiled our research and findings onto our website/blog, continually adding new elements so that we would have a plethora of resources and research to share with the community and ultimately raise awareness.

The Toxic Task Force was created first and foremost as a hub for information on the influx of hate speech and harassment in online gaming spaces. Our work consists of defining the problem, providing resources for collaboration, sharing information about new interventions that could suppress the issue, and continuing the conversation about combating it. We recognize that, at this point in time, online gaming spaces are not safe for all: many people have been marginalized from these communities and have experienced trauma. By putting our ideas on a website, we have a hub where people can explore our research and develop an understanding of the problem, and through our Discord, participants can take part in the conversation firsthand via the discussion topics we implement. In sum, the Toxic Task Force is a group of individuals with the common goal of raising awareness of toxic hate speech and harassment and continuing the conversation, in order to hold developers and gamers alike accountable and foster a more accepting and inclusive community in online gaming.

Lastly, the goal of this project is to build more awareness of this issue and a community for those who have been affected by toxic language.

Here is our podcast and website for more conversation around this topic!

Also feel free to join our Discord server for more open discussion!

Follow Our Classmates' Projects!

This project came out of the course COM 367 Multimedia Production & Digital Culture at North Carolina State University in fall 2020, taught by Dr. Noura Howell. More posts from our class:

Gender Gap in Pro Sports: Jonathan Hudson and Tommy Delaunay

Toxic Task Force (Content Moderation): Madison Neeley, Ashley Mullins and Alex Koonce

Candid Curly Collaborative: Marissa McHugh & Sandra Garcia

Sexism in Television: Madison Mallory, Chloe Campbell, Jenaye Gaudreau, & Greer Gorra

#NoWomanLagBehind: TJ & Lucas

Misprint: Aaron Kling

Works Cited

“Ally.” Spirit AI, www.spiritai.com/ally/.

Choi, Yeomi, et al. "Deep Strike: Playing Gender in the World of Overwatch and the Case of Geguri." Feminist Media Studies, 2019, www.tandfonline.com/doi/full/10.1080/14680777.2019.1643388.

D'Anastasio, Cecilia. "Games Don't Do Enough to Combat Toxicity at Launch." Wired, www.wired.com/story/videogames-anti-toxicity-valorant-launch/.

Shuman, Sid. "PS4 System Software Update 8.00 Launching Today." PlayStation.Blog, 14 Oct. 2020, blog.playstation.com/2020/10/14/ps4-system-software-update-8-00-launching-today/.

Guardians MH, guardiansmh.org/.

Higgin, Tanner. "Online Games and Racism, Sexism, and Homophobia." The International Encyclopedia of Digital Communication and Society, 11 Feb. 2015, onlinelibrary.wiley.com/doi/abs/10.1002/9781118767771.wbiedcs055.

“Homepage.” Games and Online Harassment Hotline, gameshotline.org/.

Jhaver, Shagun, et al. "Online Harassment and Content Moderation: The Case of Blocklists." ACM Transactions on Computer-Human Interaction, 2018.

Lynch, Jamiel. "An Ohio Gamer Gets Prison Time over a 'Swatting' Call That Led to a Man's Death." CNN, 14 Sept. 2019, www.cnn.com/2019/09/14/us/swatting-sentence-casey-viner/index.html.

Ortiz, Stephanie M. "'You Can Say I Got Desensitized to It': How Men of Color Cope with Everyday Racism in Online Gaming." Sociological Perspectives, 2019, journals.sagepub.com/doi/abs/10.1177/0731121419837588.

Schelfhout, Sam, et al. "Balancing Gender Identity and Gamer Identity: Gender Issues Faced by Wang 'BaiZe' Xinyu at the 2017 Hearthstone Summer Championship." Games and Culture, 2019, journals.sagepub.com/doi/full/10.1177/1555412019866348.

Spencer, Phil. "Gaming Must Promote and Protect the Safety of All." Xbox Wire, 20 May 2019, news.xbox.com/en-us/2019/05/20/phil-spencer-promoting-safety-and-security-for-all/.


"What Is Doxxing and How Can You Protect Yourself?" NordVPN, 12 Oct. 2020, nordvpn.com/blog/what-is-doxing-and-how-can-you-protect-yourself/.

Zolides, Andrew. “Gender Moderation and Moderating Gender: Sexual Content Policies in Twitch’s Community Guidelines.” New Media & Society, July 2020, doi:10.1177/1461444820942483.
