Call of Duty will start to use AI to listen out for hate speech during online mode
Featured Image Credit: Activision

Multiplayer gamers, particularly women and diverse groups, often face toxic and prejudiced language in online video games.

Activision is tackling hate speech during online matches in its shooter game Call of Duty by using artificial intelligence.

The video game publisher said the moderation tool, which uses machine learning technology, would be able to pick up on discriminatory language and harassment in real time.

Machine learning allows AI to learn and adapt without being explicitly instructed by humans; instead, it uses algorithms and the data it is trained on to recognise patterns.
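To make that idea concrete, here is a minimal, hypothetical sketch in Python of a classifier learning to flag toxic phrases from labelled examples rather than hand-written rules. Everything below, from the toy phrases to the model choice, is an illustrative assumption; it bears no relation to ToxMod's proprietary models or training data.

```python
# Illustrative only: a tiny text classifier that learns patterns from
# labelled examples instead of explicit rules. This is NOT ToxMod --
# Modulate's models, features and data are proprietary.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical toy training data: 1 = toxic, 0 = benign.
phrases = [
    "nice shot, well played",
    "good game everyone",
    "you are trash, uninstall the game",
    "get out of here, you idiot",
]
labels = [0, 0, 1, 1]

# No rules are given; the model infers word-level patterns
# from the labelled examples alone.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(phrases, labels)

# On such a tiny dataset the output is only suggestive, but phrases
# sharing patterns with the toxic examples should be flagged as 1.
print(model.predict(["well played mate", "uninstall the game, idiot"]))
```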

Activision is rolling out the tool, called ToxMod, in Call of Duty. It is made by a company called Modulate.

Gamers have long complained about the abuse they face and research by Sky Broadband found that one in 10 female players feel suicidal because of the abuse they’ve endured.

In May 2021, Activision clamped down on hate speech in Call of Duty, banning 350,000 accounts for racism and toxic behaviour.

Activision is cracking down on hate speech. (Image credit: Activision)

“Our goal is to give players the tools needed to manage their own gameplay experience, combined with an enforcement approach that addresses hate speech, racism, sexism and harassment,” the publisher said in a press release about its anti-toxicity progress report.

The issue of racist, homophobic and sexist speech is exacerbated in multiplayer games by the sheer number of players: around 90 million people play Call of Duty each month.

Activision's chief technology officer Michael Vance said the addition of ToxMod will make the game ‘a fun, fair and welcoming experience for all players’.

Call of Duty players have often complained about hate speech while playing in online mode. (Image credit: Activision)

Vance said ToxMod allows Activision's moderation efforts to be scaled up significantly by categorising toxic behaviour based on its severity. A human moderator then decides whether action should be taken.
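As a rough illustration of that triage flow, here is a hypothetical Python sketch in which the system only scores and queues flagged clips, and a human makes every enforcement decision. All the names, types and the queueing rule are assumptions for demonstration, not Activision's or Modulate's actual implementation.

```python
# Hypothetical sketch of the triage flow described above: flagged voice
# clips are ranked by severity for human review; nothing is punished
# automatically by the model itself. All names here are assumptions.
from dataclasses import dataclass, field
from enum import Enum


class Severity(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3


@dataclass
class FlaggedClip:
    clip_id: str
    transcript: str
    severity: Severity


@dataclass
class ReviewQueue:
    clips: list = field(default_factory=list)

    def triage(self, clip: FlaggedClip) -> None:
        # Higher-severity clips jump ahead so humans see them sooner.
        if clip.severity is Severity.HIGH:
            self.clips.insert(0, clip)
        else:
            self.clips.append(clip)

    def next_for_human_review(self):
        # A human moderator pulls the next clip and decides on action.
        return self.clips.pop(0) if self.clips else None


queue = ReviewQueue()
queue.triage(FlaggedClip("clip-001", "mild trash talk", Severity.LOW))
queue.triage(FlaggedClip("clip-002", "targeted slur", Severity.HIGH))
print(queue.next_for_human_review().clip_id)  # clip-002 is reviewed first
```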

Gamers will only be able to opt out of ToxMod by disabling in-game voice chat; otherwise, the AI will be listening in.

Activision said its existing tools had led to one million accounts being given communication restrictions. Those tools include the ability for players to report others and the automatic monitoring of text chat and offensive usernames.

The code of conduct for Call of Duty bans bullying and harassment, including insults that target someone's race, sexual orientation, gender identity, age, culture, faith or country of origin.

So far, ToxMod has been added to Call of Duty: Modern Warfare II and Call of Duty: Warzone in the US, with a full rollout expected to begin when Call of Duty: Modern Warfare III launches on 10 November.

Topics: Gaming, News, Technology, Artificial Intelligence