Call of Duty adds artificial intelligence tools to moderate hate speech

Call of Duty, the popular video game, joins the growing list of platforms that use artificial intelligence tools. It seems there is hardly any area where AI is not present. In this case, artificial intelligence will be used to moderate players’ voice chat.

Voice chat is notorious for the so-called toxic behavior of some players, whose language can include discrimination, hate speech, sexism and other abuse.

Artificial intelligence will be used to monitor multiplayer voice chat and detect harmful speech. Activision, the publisher of the popular video game, will implement ToxMod, a technology developed by Modulate. ToxMod uses artificial intelligence to identify racist or discriminatory comments in the game’s voice chat.


How will artificial intelligence be used in the Call of Duty video game?


ToxMod can listen to voice conversations and analyze their context to detect hate speech or harassing language, for example from white supremacist groups. Importantly, the tool only detects toxic discourse and reports it; it does not punish players itself. It flags players who may have violated the game’s code of conduct, and it is then up to Activision to decide whether those players are sanctioned and what the sanction should be.
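In rough terms, the workflow described above separates detection from enforcement: the detector flags a clip and files a report, and the publisher decides any sanction. The short Python sketch below is purely illustrative and uses made-up names (ModerationReport, detect_violation, handle_clip); it is not Modulate’s or Activision’s actual code.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ModerationReport:
    player_id: str
    transcript: str
    category: str      # e.g. "hate_speech" or "harassment"
    confidence: float  # the detector's confidence, not a verdict

def detect_violation(transcript: str) -> Optional[ModerationReport]:
    # Stand-in for the AI detector: flags clips that look like hate speech.
    flagged_terms = {"placeholder_slur", "placeholder_threat"}  # not real terms
    if any(term in transcript.lower() for term in flagged_terms):
        return ModerationReport(player_id="", transcript=transcript,
                                category="hate_speech", confidence=0.9)
    return None

def handle_clip(player_id: str, transcript: str,
                review_queue: List[ModerationReport]) -> None:
    # The tool only files a report; deciding and applying sanctions is left
    # to the publisher's enforcement process, as the article notes.
    report = detect_violation(transcript)
    if report is not None:
        report.player_id = player_id
        review_queue.append(report)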

According to Activision, this new initiative will strengthen Call of Duty’s existing moderation systems, which already include text-based filtering in 14 languages for in-game chat and a player reporting system. Notably, toxic language is screened not only in chat content but also in usernames.

The ToxMod tool will focus on detecting specific keywords and violations of the game’s code of conduct. Another noteworthy point is that players will not be able to turn the tool off; the only way to opt out is to disable voice chat altogether.
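As a loose illustration of keyword-based flagging (not Modulate’s actual implementation), the Python sketch below matches a normalized utterance against a placeholder blocklist; the word list, normalization steps, and function names are assumptions made for this example.

import re
import unicodedata

BLOCKED_KEYWORDS = {"exampleslur", "examplethreat"}  # placeholders, not real terms

def normalize(text: str) -> str:
    # Lowercase, strip accents, and drop separators so spaced-out or
    # punctuated spellings still match.
    text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode()
    return re.sub(r"[\s\-_.*]+", "", text.lower())

def contains_blocked_keyword(utterance: str) -> bool:
    # True if any blocked keyword appears in the normalized utterance.
    flattened = normalize(utterance)
    return any(keyword in flattened for keyword in BLOCKED_KEYWORDS)

print(contains_blocked_keyword("you are an e-x-a-m-p-l-e-s-l-u-r"))  # True

A real system would weigh conversational context rather than keywords alone, since ToxMod is said to analyze the context of conversations as well.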

The AI-powered tool is already operational in North America in Call of Duty: Modern Warfare II and Call of Duty: Warzone. It will roll out to the rest of the world with the release of Call of Duty: Modern Warfare III. At launch, the moderation tool will analyze voice chats in English; other languages will be added as it expands worldwide, although which ones has not yet been announced.

Activision clarifies that the tool will not act on trash talk or friendly banter.


Good results for the AI tool in North America


According to the developers of the artificial intelligence tool, 20% of players did not relapse into toxic language after receiving a first warning. Those who did re-offend faced sanctions such as restrictions on their account’s access to voice and text chat. This is a very interesting application of artificial intelligence. According to Activision, using new technology is key, and the company looks forward to working with the community to keep Call of Duty fair and fun for everyone.
