
Modern Warfare 3 Uses AI To Listen To Voice Chat And Take Action Against Toxic People

Activision partners with Modulate for a new enforcement system in Call of Duty.


Anyone who has played a Call of Duty online match is surely familiar with disgusting, unsavory, and awful voice chat. Activision is taking steps to improve the online experience for everyone by partnering with Modulate to offer "real-time voice chat moderation" for a number of Call of Duty games, including this year's Modern Warfare III.

Voice chat in Modern Warfare III will use Modulate's AI-powered moderation system, ToxMod, to identify and take action against all forms of toxic speech. This is in addition to the measures Call of Duty's anti-toxicity team already takes against toxic text chat and offensive usernames. The Call of Duty series also has an in-game reporting system that players can use to alert Activision to jerks in online spaces.

Now Playing: Modern Warfare III - 'Open Combat Missions' Intel Drop Gameplay Trailer

Activision chief technology officer Michael Vance said in a news release that there is "no place for disruptive behavior or harassment in games ever."

"Tackling disruptive voice chat particularly has long been an extraordinary challenge across gaming," Vance said. "This is a critical step forward to creating and maintaining a fun, fair and welcoming experience for all players."

A beta version of ToxMod will arrive in Modern Warfare II and Warzone today, August 30. The full release will be available in Modern Warfare III at launch on November 10 everywhere except Asia. At launch, ToxMod only works for English speech, but support for additional languages will follow.

Activision said its own anti-toxicity moderation systems for voice and text chat have taken action against more than 1 million accounts since Modern Warfare II launched in October 2022.

It's not just Activision that's trying to clean up its online games. Unity Technologies recently announced a tool called Safe Voice that aims to help developers identify toxic behavior in voice chat. Before that, Xbox announced and released a tool that lets players capture toxic audio and send it to Microsoft.

Modern Warfare III launches on November 10, but there will be multiplayer betas ahead of launch. For more, check out the full Modern Warfare III multiplayer beta schedule and breakdown.

