
VALORANT Developers Weigh Further Action as Toxicity Shows Little Decline

Riot Games Discusses Recent Findings on Toxicity in VALORANT

In a blog post released today, Riot Games shared its recent findings on player toxicity in VALORANT. A survey conducted by the social and player dynamics team revealed that harassment in the game remains frequent. In response, Riot Games plans to implement harsher punishments and tighten auto-moderation.

Harsher Punishments and Improved Auto-Moderation

VALORANT will impose stricter punishments for toxic behavior as the developers gain more confidence in their detection methods. The game already detects offensive words automatically, but Riot Games is working on a way to punish players immediately rather than after the match ends. The company is also focusing on improving auto-moderation to catch the creative ways players try to evade detection.

Challenges in Voice Chat Moderation

One of the biggest challenges in moderation is dealing with abusive players in voice chat. Voice abuse is harder to detect and requires a more manual process to punish offenders. Riot Games aims to be transparent about its voice moderation efforts and plans to provide updates by the middle of this year.

Future Plans and Community Engagement

Riot Games has been working on improving how it evaluates voice communications, though no immediate changes to voice chat moderation are apparent yet. The company recognizes the importance of addressing harassment and is actively seeking solutions; a beta launch for voice chat punishments is planned for North America later this year. In the meantime, players faced with toxicity in chat are encouraged to report, mute, and move on.

Tags: Riot Games, player toxicity, VALORANT, punishments, auto-moderation, voice chat, community engagement