Content Moderation And Online Gaming

Identities and Gaming

Content Moderation

Lastly, there is content moderation: the use of volunteer or algorithmic moderators to oversee conversations and make sure they meet community guidelines. The focus here is on how content moderation affects voice and text chat features in online gaming. We consider the existing forms of content moderation and the improvements that have been made to them. The article "Online Harassment and Content Moderation: The Case of Blocklists" explains blocklists on Twitter as a form of content moderation and how this feature helps prevent further harassment. Moreover, in the article "Gaming Algorithmic Hate-Speech Detection," Jesse Haapoja describes an algorithm, built from forums, discussions, and interviews, for detecting hate speech. Toxic speech is not a new issue, and game developers have attempted to resolve it, so why do toxic speech and behavior still persist?

PS4 Interventions

On October 14, 2020, PlayStation released its 8.0 update, which adds new content moderation and parental control features. With the update, players can record voice chats and submit them to PlayStation for moderation. However, the recording feature is only available to PS5 players, meaning that a PS5 player can record a voice chat that includes PS4 players, but those on a PS4 cannot record the chat themselves. PlayStation also added a new "mute all mics" feature to the Quick Menu. However, the company's blog post on these updates did not make clear whether this feature mutes all the mics used by one individual player or the mics of all the players in an online party. Parental controls were also updated, combining "communicating with other players" and "viewing content created by other players" into one simplified control. This parental control feature also allows kids to request access to these features for one specific game; the request is sent to the parent via email, and the parent can approve it if they wish. One aspect that PlayStation left out of its discussion is how the voice recording moderation feature will be processed. The company simply says that recordings will be sent in to be moderated, but does not explicitly state how these instances will be investigated, who will investigate them, or what penalties those who violate community standards will face. PlayStation's move toward recording voice chats is a step in the right direction. However, because the feature is limited by which console players have, and because the investigation process is not detailed, it is hard to say whether it will be effective in stopping the spread of harassment and hate speech in online communities.

Safe Spaces and Harassment Help for Gamers

Created in 2019–2020, the Gaming and Online Harassment Hotline provides a safe space for those who have experienced harassment online to talk about their experiences and how they have been affected. The group worked with existing crisis hotlines to create this service. Those who have experienced harassment can text the hotline and receive quick responses from its organizers, either to talk about their experiences or to request referrals to other services, whether mental-health based or legal. The hotline's website (linked below) provides the number, background on why the hotline was established, and an extensive list of outside resources that harassment victims can seek out on their own or through referrals from hotline administrators. The hotline is available Monday through Friday, 4–7pm PST. While it is beneficial to have this hotline in place, it is limited in the space it can cover and is not well advertised in the community. Because the hotline operates primarily during the week, it is less accessible to many younger players who can only play on weekends, when their parents allow it, and to older individuals who can only game when they are off work. In our project, we seek to build a community and raise awareness of this issue within the gaming community as well as with large companies. We seek to create a place where our voices and others' can be heard. Our project should serve as a way for companies to understand the voices that need and desire change on online gaming platforms.

Company Interventions

In late 2018 and early 2019, in the wake of GamerGate, Microsoft updated its community guidelines and promised that new content moderation technology would be released in 2020. However, no further information on changes to content moderation has been released. A mid-2019 post from Phil Spencer promised that more safety measures would be put in place in the coming year to protect players. Now, nearly a year and a half later, there is no information on what these safety measures would be, which is odd considering Microsoft said it would work alongside Sony and PlayStation to roll out similar measures. As noted above, PlayStation has just released its plans for further content moderation. Spencer's post gave no details on whether new AI technology was being developed to filter content, only an update to community standards. Given the lack of follow-through and information on Xbox's part, it can be assumed that these solutions are either nonexistent or far in the future.

What’s Our Project’s Focus?

While our project bears similarities to some of these examples, we lay out a vastly different concept. Rather than focusing solely on mental health, harassment, or content moderation, we bring all of these ideas and more together. From Reddit, we can see that players have a lot to say about content moderation, and from the statistics we know that harassment is a widespread problem that often leads to mental health concerns. Our Discord server, which is not yet operational, will provide a safe space for discussing content moderation and for sharing personal experiences with harassment and hate speech. We are doing this through two separate channels. The first focuses on content moderation: we have listed questions (pinned to the channel) that direct the conversation, and we can step in when needed to guide the discussion further. The second is a personal experiences channel. It will have guidelines to keep the safe space operational, focusing on sharing personal experiences with content moderation or harassment and encouraging peer support among those who understand each other's concerns and experiences. Our podcast seeks to raise awareness of the situation, discussing what our research and personal stories suggest may be the most effective form of content moderation, as well as our views on the moderation currently in place. We also discuss the issue of children in gaming and how game ratings can conflict with what kids actually experience in game. The website brings all of these resources together in one space and gives a definitive view of our own research on the topic.

Guiding Questions

How did you choose the problem space you wanted to work in?

Follow Our Classmates’ Projects!

This project came out of the course COM 367 Multimedia Production & Digital Culture at North Carolina State University in fall 2020, taught by Dr. Noura Howell. More posts from our class:



