As the eagerly awaited launch of Call of Duty: Modern Warfare 3 approaches, Activision is stepping up moderation of in-game voice chat to ensure that toxicity and hate speech find no place within the game.

Real-Time Voice Moderation System

In a 2020 poll of 1,000 US gamers ages 18 to 45 conducted by the Anti-Defamation League (ADL), more than half of multiplayer gamers reported harassment based on their race/ethnicity, religion, ability, gender, or sexual orientation. 

Furthermore, 81 percent of multiplayer gamers have experienced some type of harassment, with the majority also reporting physical threats, stalking, and sexual harassment. To do their part, game developers have already begun addressing this issue.

Activision's commitment to combating toxic behavior has culminated in the introduction of a real-time voice moderation system powered by AI, set to be a central feature of the Call of Duty experience moving forward. 

This innovative step was disclosed in a recent blog post on the Call of Duty website, revealing the company's aspirations to establish "global real-time voice chat moderation, at scale" to combat the rise of toxic speech.

AI-Powered Voice Chat Moderation

The driving force behind this technological leap is a collaboration between Activision and Modulate, the company behind the AI-powered voice chat moderation technology known as ToxMod.

The system is designed to swiftly identify and clamp down on toxic behavior, ranging from hate speech and discriminatory language to instances of harassment.

Players who do not wish to have their voice moderated can disable in-game voice chat in the settings menu.

ToxMod will not stand alone, however. This cutting-edge innovation complements existing text-based moderation systems that encompass a staggering 14 different languages, demonstrating a concerted effort to foster a globally diverse and respectful gaming community. 

So, no more trash talk in-game? Not quite. The new system is meant to enforce the existing Code of Conduct, which still permits trash talk and friendly banter.

What's Next?

But how will this transformative technology be deployed? An initial beta rollout has already commenced, targeting North American players of Call of Duty: Modern Warfare 2 and Call of Duty: Warzone, both of which have now been fortified with the new voice moderation features. 

This preliminary phase will be followed by a worldwide launch on November 10, alongside the release of Call of Duty: Modern Warfare 3. The Asian region is excluded, as the technology will not be implemented there.

The existing anti-toxicity measures within Call of Duty have resulted in voice and/or text chat restrictions for more than a million accounts. 

According to Activision's data, 20% of players did not re-offend after receiving their first warning. Those who did faced tangible consequences, with account penalties ranging from chat bans to temporary account restrictions.

Activision's approach parallels a similar initiative recently undertaken by Microsoft, which introduced an AI voice chat recording tool, augmenting its moderation efforts. 

Stay posted here at Tech Times.

