Activision Introducing ToxMod System to Combat In-Game Toxicity in Call of Duty
| Tags: Call of Duty
| Author: Xavier Geitz
Activision is introducing a new program to combat one of Call of Duty's most persistent, unintended “features”: toxicity.
We're taking the next leap forward in our commitment to combat toxicity with in-game voice moderation launching day-and-date with #MW3
The moderation beta will launch today for North America (English only). Learn more 👉 https://t.co/FsBVNk2LbN pic.twitter.com/Z3O11JMJYF
— Call of Duty (@CallofDuty) August 30, 2023
Longtime fans of the franchise will remember the legendarily toxic Call of Duty lobbies of yesteryear. And while modern CoD isn't as toxic as it was back in the day, it's still far from friendly. Activision is hoping to change that with its newest system.
Activision has partnered with Modulate to implement ToxMod, an AI-powered in-game voice chat moderation system. ToxMod will identify hate speech, discriminatory language, harassment, and other offenses in real time, and it will also enforce Call of Duty's policies in real time.
The new ToxMod system will go live worldwide, excluding Asia, when Modern Warfare 3 launches on November 10. A preliminary beta of ToxMod is already live in Modern Warfare 2 and Warzone in the NA region. ToxMod currently supports English only, but Activision plans to add additional languages at a later date.
Combined with the existing text-based filtering across 14 languages, ToxMod has a lot of potential to make Call of Duty a little more fun and much less toxic.