Video game giant Activision has revealed the first results of the AI moderation tool it rolled out for Call of Duty last year: more than 2 million accounts have seen “in-game enforcement for disruptive voice chat” since ToxMod was deployed in August.
Activision debuted the tool in beta ahead of last fall’s launch of Call of Duty: Modern Warfare III and gradually expanded it globally. The company said that ToxMod can identify “disruptive” comments across 14 languages in the games Call of Duty: Modern Warfare II, Modern Warfare III, and Call of Duty: Warzone.
“For disruptive behavior found by our voice moderation, only 1 in 5 users reported the behavior, showing an unfortunate trend that players often do not report in-game instances to our Disruptive Behavior team,” Activision said. “In cases that go unreported, our voice moderation system allows us to take action against players that violate the Code of Conduct.” The company added that active reporting remains critical.
In 2023, artificial intelligence took the world by storm, and several industries (including gaming) looked for ways to leverage the technology. Activision’s parent company, Microsoft, has added generative AI tools to its Office applications and Xbox.
In a January 23, 2024 post from the official Call of Duty account (@CallofDuty), the company wrote: “One such update is that we’ve seen a ~50% reduction in players exposed to severe instances of disruptive voice chat since #MW3’s launch. However, only 1 in 5 players report disruptive voice chat. To encourage more reporting of disruptive behavior, we’ve rolled out messages that…”
ToxMod began to pay off almost immediately, the company said, allowing it to evolve its moderation approach. Activision said it had seen a month-over-month reduction in repeat violators of its online conduct policy since the AI model was introduced.
“Call of Duty saw an 8% reduction of repeat offenders since the rollout of in-game voice chat moderation,” the report said. “Anyone detected to have violated the Code of Conduct will receive actions such as globally muting from voice and text chat and/or restricting other social features.”
Activision also highlighted a 50% reduction in “severe instances” of disruptive voice chat since Modern Warfare III’s launch.
Players who continue to violate the policy face further restrictions, Activision said, adding that it is looking for ways for players to provide “additional feedback.”
For gamers in need of a refresher on what could get their account reported, Activision pointed to an update to the Call of Duty Code of Conduct that online players are expected to follow.
In “Treat Everyone with Respect,” Activision reiterated its stance on clamping down on toxic speech, including a zero-tolerance policy for bullying and harassment, as well as derogatory remarks related to race, gender identity or expression, sexual orientation, age, culture, faith, mental or physical abilities, or country of origin.
“Call of Duty is dedicated to combating toxicity within our games and will empower our teams to deploy and evolve our moderation technology to fight disruptive behavior, whether it be via voice or text chat,” the company said. “We understand this is ongoing work, but we are committed to working with our community to make sure Call of Duty is fair and fun for all.”
Edited by Ryan Ozawa and Andrew Hayward
Source: https://decrypt.co/213971/watch-your-mouth-anon-call-of-duty-ai-tool-punishes-millions-of-toxic-players