The developer behind the AI-powered voice chat moderation service in GTA Online has explained how it works. It has been just over a year since Rockstar Games first began testing this new way of monitoring and banning players in GTA Online.
When the rollout began, it was met with quite a bit of criticism from some vocal players. However, since it was fully enabled across all versions of GTA Online and Red Dead Online in April this year, it has rarely been discussed.
Now Modulate, the company behind the software, called ToxMod, have written an article about their partnership with Rockstar Games and what they do for GTA Online. Rockstar also re-shared the article on Twitter, where it was met with quite a few upset responses from players and non-players alike. As noted above, there had been very little talk about ToxMod until Modulate and Rockstar themselves mentioned it.
Working With Rockstar Games
Modulate start off their article by saying they are “committed to working closely with the team at Rockstar Games to protect the millions of players enjoying GTA Online against unwanted toxicity and harassment.”
Why It Is Used
As for why they and their systems were brought in by Rockstar, Modulate explain that GTA Online's community is simply so big and active that it is always difficult for Rockstar to look after its extremely large player base on its own. GTA V has sold over 205 million copies to date, after all.
What Does It Actually Do?
Relying on reports from other players does not do enough to keep GTA Online a safe space, Modulate state. ToxMod is used to help "prevent harassment and toxic behavior".
When ToxMod was fully launched in GTA Online and Red Dead Online last year, Rockstar Games also issued a new set of Community Guidelines that all players must follow, alongside updating its Terms of Service. Modulate have tweaked ToxMod to enforce GTA Online's Community Guidelines, which of course differ from those of other games.
From there, ToxMod alerts Rockstar's GTA Online moderation team the moment "problematic voice chat interactions" happen, rather than waiting for player reports to come in. Those at Rockstar can then decide whether to act on the potential conduct violations. This helps Rockstar Games enforce "the guidelines and incentivize positive player behavior within the community".
Utilizing advanced machine learning technology, ToxMod understands the nuances of player conversations, distinguishing between trash talk and intentional harm and harassment targeted at other players. This means that moderation teams can quickly intervene before those toxic interactions can escalate.
What are your thoughts on ToxMod since it has been part of GTA Online and Red Dead Online? Let us know down in the comments.
To keep up to date with every GTA Online news update, make sure to check back to RockstarINTEL and sign up to our newsletter for a weekly round-up of all things Rockstar Games.