Pro Esports Player Built An AI System To Help AAA Game Devs Track Toxic Gamers

Anyone who has ever played an online video game has likely received a nasty message from another gamer. Most games include a report button, but the feature can often feel as if it does nothing. A pro esports player and his co-founders have created an artificial intelligence system that helps AAA studios better deal with abusive behavior in their games.

GGWP is an AI system developed by esports pro Dennis Fong, Crunchyroll founder Kun Gao, and Dr. George Ng of Goldsmiths, University of London and Singapore Polytechnic. It is backed by Sony Innovation Fund, Riot Games, YouTube co-founder Steve Chen, Twitch streamer Pokimane, and Twitch co-founders Emmett Shear and Kevin Lin, among other investors. The company has so far raised $12 million in seed funding.

GGWP was created to address the toxicity issues that nearly every gamer has faced at some point or another. Fong was personally inspired to develop the AI after years of frustration with toxicity in the gaming community. He told Engadget, "You feel despondent because you’re like, I’ve reported the same guy 15 times and nothing’s happened."

The GGWP API essentially aggregates player data to determine which toxic behaviors occur most frequently in each video game. The system is "fully customizable," so a behavior that is acceptable in one game can be flagged in another. Players are assigned "reputation scores," and developers can attach different responses to both positive and negative behaviors. For example, a notification could pop up when a player’s reputation score drops after they use a slur. Fong and his team hope that notifications and similar features will discourage undesirable behavior and reduce the overall number of reports. Fong argued that the system could be easily implemented by a AAA studio and may actually reduce the amount of moderation work studios need to do.
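To make the idea concrete, here is a minimal sketch of how a per-game, configurable reputation system like the one described could look. All names, weights, and thresholds below are hypothetical illustrations, not GGWP's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class ReputationConfig:
    """Per-game configuration: the same behavior can carry different weights per title."""
    behavior_weights: dict[str, int]
    notify_threshold: int = 50  # notify the player when their score falls below this

@dataclass
class Player:
    name: str
    score: int = 100  # players start with a neutral-to-positive reputation
    notifications: list[str] = field(default_factory=list)

def record_behavior(player: Player, behavior: str, config: ReputationConfig) -> None:
    """Adjust the player's reputation score and queue a notification on a drop below the threshold."""
    player.score += config.behavior_weights.get(behavior, 0)
    if player.score < config.notify_threshold:
        player.notifications.append(
            f"Your reputation score dropped to {player.score} after: {behavior}"
        )

# Example: one game penalizes slurs heavily and slightly rewards sportsmanship.
config = ReputationConfig(behavior_weights={"slur": -60, "gg_message": +2})
p = Player("ExamplePlayer")
record_behavior(p, "slur", config)
print(p.score)               # 40
print(len(p.notifications))  # 1
```

A second game could supply its own `behavior_weights` table, which is what the article's "fully customizable" framing suggests: the scoring machinery stays shared while each studio defines which behaviors matter and how much.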

Fong and his associates spoke with an unnamed AAA studio during the early stages of development to determine why toxicity has remained such a prevalent issue. The studio contended that toxicity is not its fault and that it is therefore not responsible for stopping it. It also insisted that there are simply too many reports and too much abuse for it to manage successfully.

The team purportedly spoke with several other AAA studios, all of which claimed that they receive millions of player reports each year and cannot deal with all of them. One game supposedly collected more than 200 million player-submitted reports in a single year. It is estimated that AAA studios address roughly 0.1% of reports each year, and that these studios tend to hire fewer than ten moderators to tackle the problem.

Fong noted that his team currently has 35 engineers and data scientists devoted to "trying to help solve this problem." The team is hopeful that it can make an impact in gaming. Fong remarked, "The vast majority of this stuff is actually almost perfectly primed for AI to go tackle this problem. And it's just people just haven't gotten around to it yet."

Image of Player Reputation courtesy of GGWP