Have you ever been excited to play a game, only to be instantly greeted by toxic players screaming into your ears? Voice chat toxicity is one of the most common forms of toxic interaction in gaming communities, especially when competition gets heated between teams. From the good old screaming Call of Duty lobbies to CS:GO matches peppered with racial slurs among teammates, voice chat toxicity can make or break a player’s experience with a game.
This can ruin everyone’s gaming experience, but unfortunately, toxic culture is rampant, especially in online gaming. While many games put various kinds of filters in place to counter this, toxic gamers always find a way to spread hatred. Manually moderating in-game voice chat channels has been pretty much impossible due to the massive number of players. But what if it could be moderated automatically? That’s where ToxMod steps in.
"We want to fix voice chat," says @modulate_ai, the developer of ToxMod. It's a tool that uses machine learning AI to moderate voice chat in games. https://t.co/Sp0H2OA7uc pic.twitter.com/G8pJ3eG9QP
— PCGamesN (@PCGamesN) December 16, 2020
Boston-based technology company Modulate developed ToxMod, the world’s first voice-native moderation service. It uses artificial intelligence to improve online safety and identity protection and to clean up voice chat in video games. Most importantly, it uses machine learning to detect not only what players are saying but also the tonality of their voice. This means the software can discern whether a person is merely excited or just plain toxic.
ToxMod can help game developers moderate their communities in a way they couldn’t before. With it, developers can detect toxic, disruptive, or problematic speech in real time and take the necessary action against the players in question.
Inclusive gaming means voice chat can't be the wild west – studios need tools to stop bad actors and to coach folks away from toxic behavior. We're excited to share @VentureBeat's intro of ToxMod, our new voice moderation tool which does just that!
— Modulate (@modulate_ai) December 14, 2020
On top of detecting speech, ToxMod can maintain its own index of flagged words and phrases. This way, developers can ensure certain insensitive phrases are completely blocked from the community. Since the software can detect the emotion and tone of a player’s speech, it offers safer moderation than relying on out-of-context audio clips.
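Purely as an illustration (Modulate has not published ToxMod's internals, and every name below is hypothetical), a flagged-phrase index combined with a tone score might look roughly like this:

```python
# Hypothetical sketch only: ToxMod's real pipeline is proprietary.
# This toy moderator checks a transcript against a phrase index and
# uses a tone score to separate excitement from aggression.

class PhraseModerator:
    """Flags transcribed chat lines containing blocked phrases,
    gated by a caller-supplied tone score (0 = calm, 1 = aggressive)."""

    def __init__(self, blocked_phrases, tone_threshold=0.5):
        self.blocked = [p.lower() for p in blocked_phrases]
        self.tone_threshold = tone_threshold

    def review(self, transcript, tone_score):
        """Return the blocked phrases found in the transcript, but only
        when the speaker's tone crosses the aggression threshold."""
        if tone_score < self.tone_threshold:
            return []  # excited but not aggressive: let it through
        text = transcript.lower()
        return [p for p in self.blocked if p in text]


mod = PhraseModerator(["trash player", "uninstall the game"])
print(mod.review("You are a TRASH player", tone_score=0.9))  # -> ['trash player']
print(mod.review("You are a TRASH player", tone_score=0.2))  # -> []
```

The point of the sketch is the gating step: the same words produce a flag or a pass depending on the detected tone, which mirrors the article's claim that ToxMod distinguishes excitement from genuine toxicity.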
ToxMod is a genuinely innovative and exciting program for gaming. It may not be perfect yet, but it is definitely a step toward positive change for the online gaming community.
Photo credit: The feature image has been taken by Fredrick Tendong.
Source: Tony Xiao (New York Times) / Modulate