Much of OHPI’s recent work has been focussed on online hate on social media, but hate speech is also a prevalent issue in online multiplayer gaming. It is therefore important that these games have robust reporting mechanisms that can be utilised by users who encounter hate.
The effectiveness of reporting systems in games varies. Some are as extensive as those on mainstream social media sites, in others reporting feels like an afterthought, and in a few it is not even an option. For players who are uncomfortable confronting the perpetrators of online hate, a weak or missing reporting system can leave them powerless to address the problem. In this briefing, we assess the reporting systems of some of the most prominent online multiplayer games.
Marvel Rivals
Marvel Rivals has quickly become one of the most popular multiplayer games. It is a team-based hero shooter that features over 35 characters from the Marvel universe. Each of them has unique abilities which add variety to the gameplay.
Like any popular online multiplayer game, Marvel Rivals has issues with hate speech. The following screenshot shows a player resorting to racist remarks while losing a competitive match.
In the next screenshot, a player calls another player gay after one of their teammates refused to cooperate with them. They then call their teammate insufferable.
Text and voice chat are not the only ways to express hate speech in Marvel Rivals. In the following image, a player has a username that sarcastically hints at racism.
Fortunately, the reporting system in Rivals is fairly robust. When reporting content, users have 7 potential reasons to choose from, including text chat abuse, voice chat abuse, negative behaviour, and inappropriate nicknames. Each of these leads either to more options or to the submission screen.
Upon submitting a report, users receive a notification that the report has been successfully submitted – and are sometimes notified that the reported player already has a penalty in effect. One may also receive a notification afterwards stating that action has been taken against the offender.
You can also report players after a match. The game stores a recording of each session you participate in, allowing you to watch what occurred and ensure you report the right players.
While it is commendable that Rivals has a decent reporting system, the penalty system can cause issues when encountering hateful players. You cannot leave an ongoing match without facing a penalty.
If you don’t want to receive the penalty, you can report and mute another player, but you will have to tolerate their presence throughout the match. If their hate is targeted towards you, then they may resort to obstructive behaviour or refuse to cooperate. For example, Jeff the Land Shark has an ability that can swallow both allies and enemies and transport them elsewhere. Hence, a malicious player can use this to make a teammate constantly lose control of their character and move them away from battles, even if they can’t communicate with them verbally. As a result, the penalty system may incentivise users to endure harassment instead of leaving a match.
Team Fortress 2 (TF2)
TF2 is similar to Rivals: a team-based game where you play as one of 9 characters, each with their own unique weapons and tools.
For anyone who is sensitive to hate speech, playing TF2 can feel like navigating a minefield. There can be up to 24 players in each session, all of whom can type into the text chat or use their microphone. Hate speech is not uncommon, with comments oftentimes being homophobic, racist or misogynistic.
The below screenshot shows a TF2 player saying their “favourite colour is negro” after a match had ended.
Lastly, this screenshot shows a player making an unprompted transphobic comment in response to another player’s text chat. The person they were responding to had done nothing to suggest they were trans.
There are only four reasons to choose from when reporting someone in TF2: cheating, being idle, harassment, and griefing. After picking a reason and submitting, players receive no feedback as to whether their report has gone through. Sometimes, players are notified when action is taken against someone they have reported.
In addition to reporting, you can mute a player, which prevents you from seeing their text chat and hearing their voice (though other players may still see and hear them). A muted player stays muted in any future sessions you share with them.
Another option – albeit a more confrontational one – is to “kick”, which removes a selected player from the session.
Allowing anyone to kick unilaterally would let malicious players boot others as they see fit, so a vote is called to determine whether the majority of players agree with the decision. This means that whether a kick succeeds depends entirely on the other players, and minorities who are targeted with hate speech may not be able to win such a vote.
Helldivers 2 (HD2)
Helldivers 2 has taken the world by storm, selling over 12 million copies and becoming the fastest-selling PlayStation game to date. Up to 4 players join together to complete objectives on a large open map while eliminating hundreds of non-player-controlled enemies. It is consistently among the top 50 most-played games on Steam, and it keeps players engaged through regular content updates that add something new to look forward to each month.
While hate speech may be rarer in Helldivers than in other games, it is still a problem. In the following screenshot, a user reports their experience of harassment due to their accent. One of the players in their session mocked the user by putting on an Indian accent. The hate eventually became more targeted and hostile, so the user stopped using their microphone.
By default, your character in Helldivers has a randomised voice, and when they die, their replacement could have a different voice. Some players prefer a particular kind of voice, which they can set through customisation. However, the perpetrator in the following Reddit post kept killing themself in-game whenever their character had a female voice. The poster recalled them saying “Man, why am I always playing as a fucking woman, I can’t play like this.” The session became uncomfortable for the other players as a result.
When you encounter a situation similar to the above, it is natural to want to report the offenders. But the in-game reporting system on HD2 has serious flaws.
When attempting to report a player, you are presented with six options, one of which is “Voice Chat.” This is presumably for situations where a player is using their microphone to harass or spout hate speech. Yet choose this option and you will be greeted with the message “A player cannot be reported for this action”, as pictured below.
Another option is “Text Chat”, for cases where users are expressing hate speech through in-game chat. The problem is that this only works on some players, as illustrated in the following screenshots.
You can only report someone who has an ID on the PlayStation Network (PSN). For context, the game can be played on one of two platforms: through Steam on a computer (PC) or on a PlayStation 5 (PS5). PC players can play online with nothing additional required, whereas PS5 players need a PS Plus subscription, and therefore a PSN account. This means that users cannot report players on Steam for text chat activity, leaving them with no recourse when faced with text-based hate speech from these users.
PlayStation once announced that it would require every PC player to log into a PSN account in order to play. However, it received significant backlash for this controversial decision and cancelled the plan as a result. Privacy, security, and availability appeared to be the main concerns: PSN has been plagued with outages and security breaches, and people in 177 countries cannot create a PSN account.
When you are able to report someone in Helldivers, the list of potential reasons is extensive and includes hate speech.
What recourse do players have in the cases discussed above, where they are unable to report a user for hate speech? As in the other games, kicking is an option. Unlike in TF2, you can kick someone as you see fit, as long as you are hosting the session. If you have joined someone else’s session, however, muting or leaving may be your only options. Neither is a sufficient response to hateful content, as the developer/publisher is never alerted to the user’s behaviour. The perpetrators therefore face no punishment that could serve to disincentivise further hate speech.
Deep Rock Galactic (DRG) and Payday 2
While HD2 at least has a reporting system implemented, other popular cooperative multiplayer games, such as DRG and Payday 2, do not have one.
In DRG, the only recourse available to players exposed to hate speech is to kick or mute. As in Helldivers, kicking is only an option if you are the host of the session.
Overkill and Starbreeze, the developer and publisher of Payday 2 respectively, officially advise players who host a session to kick abusive players and add them to their ban list to prevent them from joining future sessions. But, again, this does nothing to notify the developer or publisher of the problematic behaviour. Another suggestion is to report the behaviour on Steam. However, Steam’s support page for reporting relates to content on Steam itself, not in-game behaviour. As such, there is no adequate reporting mechanism for those playing Payday 2.
Conclusion
As detailed in this briefing, the quality of reporting systems can vary between games.
Competitive games can have great reporting systems, but the penalty for leaving a match may incentivise players to prolong their exposure to hateful content. Furthermore, some of the most popular cooperative multiplayer games can have no reporting systems whatsoever. If developers ever hope to reduce the amount of hateful content on their platforms, they must ensure that games have robust reporting mechanisms that allow users to take action against the problem.
The quantity of hate can also vary between games. Players in team-based competitive games can become frustrated with their teammates and may vent by spewing hateful remarks. Games that place many players in a session increase the risk of encountering hate speech, while cooperative games tend to have fewer players per session, so hate may be less frequent.
People may play multiplayer games to relax and have fun, and to say hate speech can ruin one’s mood would be an understatement. Exposure to hate speech is correlated with a number of physical and psychological harms, including PTSD and depression.
Fighting hate speech should not be a second job. Confronting offenders takes a lot of energy, so reporting systems should be improved to enable more players to combat hate effectively. Furthermore, developers and publishers should take responsibility for acting against perpetrators to discourage such behaviour.