
Riot Games Will Record Valorant Voice Chat To Combat Toxic Behavior

It's unclear how Riot's new chat moderation works, but data will be deleted after it's no longer needed.


Riot Games is updating its privacy notice to inform players that, across all of its games, it will be using new tools to reduce toxicity. The first game to use them is the tactical shooter Valorant, where in-game voice chat will be recorded when a report is filed.

The studio will store audio data for use when players report those engaging in abusive or disruptive behavior. The data is then reviewed to see if anything violates Riot Games' Terms of Service or other policies. It will be made available to the violating player if an infringement occurred and deleted after it's no longer needed. Data will also be deleted if the recording contains no disruptive behavior. A similar system is in place on PlayStation, though it requires players to record a small section of a chat in order to send it for a moderation review.

Riot said the new voice moderation tools don't involve actively listening to live in-game audio. The studio clarified that it will only listen to audio once a report has been filed. For now, the system will be beta tested in North America before expanding to other regions. Those concerned about their privacy can always opt for a third-party voice chat app like Discord.

"We're committed to making our games better for everyone who plays them," Riot said. "This is another step toward tackling disruptive behavior across the board, starting with Valorant. Stay tuned for more from our Central Player Dynamics team."

These changes are Riot-wide, TechCrunch reports, meaning all players across all of the studio's games, including League of Legends and Legends of Runeterra, have to accept them. It's unclear when other Riot Games titles will get these updates.

Riot Games didn't specify how these new voice moderation tools will work. According to head of player dynamics Weszt Hart, the technology required to detect behavioral violations over voice chat is still in development. Hart hinted that it may lean on machine learning or focus on automated voice-to-text transcription.

Hart also made explicit reference to the "pain in voice comms" that spurred Riot to think of a solution to tackle abusive or disruptive behavior while gaming online.

"Players are experiencing a lot of pain in voice comms and that pain takes the form of all kinds of different disruption in behavior and it can be pretty harmful," Hart said. "We recognize that, and we have made a promise to players that we will do everything that we could in this space."

In other Valorant news, the game received an update that made some substantial changes, like adding a new map, a spectator-like feature called Coach spots, changes to custom game modes, and balance adjustments for various agents.

Got a news tip or want to contact us directly? Email news@gamespot.com
