eSports

COD: 2 Million Players Punished For Toxic Behaviour in 4 Months


In a recent Call of Duty update, Activision Blizzard revealed that a staggering two million player accounts have been hit with in-game enforcement over toxic behaviour. The figure came as part of an update on the publisher’s latest in-game moderation mechanics, specifically the automated voice moderation features rolled out in August 2023. These accounts were punished for ‘disruptive voice chat’ in a Call of Duty game.

The data-driven report published on callofduty.com tells something of an awful story. For many, Call of Duty is nothing without its trash talk and ‘banter’, but it’s obvious just how much of an impact those communications have on players. For years, Call of Duty has been synonymous with toxicity, particularly in online multiplayer modes like Search and Destroy, where players launch insults and abuse at one another in almost every match.


Good But Not Enough


In the blog post, Activision Blizzard revealed that, as a result of the moderation mechanics, there has been a 50% reduction in the number of players exposed to ‘severe instances of disruptive voice chat’ over the last three months. Not only that, but an 8% reduction was recorded in ‘repeat offenders’ – users who would be punished and then continue to break the rules and remain toxic in-game. Ultimately, two million player accounts were hit with punitive measures because of toxic communications.

However, there’s still a core issue, as Activision Blizzard stressed. Of all the disruptive behaviour that the AI-driven voice moderation detected, only 20% of instances were reported by other players – leaving 80% of toxic, abusive communications unreported and, previously, slipping through the net. Thanks to the new technology, a player report is no longer a necessary trigger for action to be taken against these malicious operators.

If you’re abusive in-game, these systems will identify that, and you’ll be reprimanded. It’s that simple.

That’s not the end of it, though. Further features are being deployed over time, with Activision Blizzard’s anti-cheat and moderation teams rolling out fresh mechanics to combat toxic and malicious in-game activity. Many players claim the game has become ‘too soft’, with the usual old-school gamers insisting that ‘today’s players wouldn’t survive their lobbies’, but Activision Blizzard is firm: toxicity isn’t to be tolerated.


For more Call of Duty news, visit Esports.net.
