Big game companies get DHS help to stop gamers from becoming terrorists
Last December, the United Nations warned of an overlooked but critical “emerging terrorist threat”: extremists radicalizing members of online gaming communities.
Despite great interest in rescuing gamers from such exploitation, experts say a lack of research funding on the subject has left the gaming industry behind social media in the fight against terrorism. That’s starting to change, though. Over the past week, researchers told Ars that the US Department of Homeland Security has, for the first time, awarded funding (nearly $700,000) to a research group working directly with major game companies to develop effective counterterrorism methods and protect vulnerable players.
The new two-year project will be run by the Middlebury Institute of International Studies, which hosts the Center on Terrorism, Extremism, and Counterterrorism (CTEC). Vice reported that other partners include a nonprofit called Take This, which focuses on the mental health impacts of gaming, and a tech company called Logically, which Vice says is working “to solve the problem of large-scale online bad behavior.”
The researchers summarized their overall goals for the DHS project as “the development of a set of best practices and centralized resources for monitoring and evaluating extremist activities, as well as a series of training workshops on monitoring, detecting, and preventing extremist exploitation in play spaces for community managers, multiplayer designers, lore developers, mechanics designers, and trust and safety professionals.”
Rachel Kowert, research director at Take This, told Ars that the main goal of the project is to develop resources focused on the gaming industry. Her group’s ambitious plan is to reach large corporations first, then engage small businesses and independent developers for maximum impact.
Alex Newhouse, deputy director of CTEC, told Ars that the project will begin by targeting large game companies that “essentially act as social platforms,” including Roblox, Activision Blizzard, and Bungie.
Although funding for the project has just been approved, Newhouse said CTEC’s work has already begun. The group has been working with Roblox for six months, and Newhouse said it’s also in “very preliminary” talks with the Entertainment Software Association about ways to expand the project.
Borrowing counter-terrorism methods from social networks
Newhouse said that beyond DHS, the FBI is increasingly interested in research like CTEC’s to combat domestic terrorism, but, to his knowledge, no federal organization has funded this kind of data collection before. Although his project is only funded for two years, Newhouse wants to push the gaming industry to implement, within five years, the same counter-extremism standards that social media platforms already have.
“I want game developers, especially the big ones like Roblox and Microsoft, to have teams dedicated to counter-extremism in games,” Newhouse told Ars. “We now have to push for the same sophistication on the games industry side.”
Newhouse plans to draw on his experience helping tech giants like Google and Facebook organize counterterrorism teams. He says CTEC’s biggest priority is convincing the gaming industry to invest in proactive moderation of extremist content by “implementing increasingly sophisticated proactive detection and moderation systems” like those social networks already use.
Historically, Newhouse said, game companies have relied primarily on players to flag extremist content for moderation. That’s not a good enough strategy, he said, because radicalization often works by boosting a gamer’s self-esteem, and people manipulated into viewing this type of online engagement as positive often don’t report these radicalizing interactions themselves. By relying strictly on user reports, game companies “won’t detect anything at the initial level of recruitment and radicalization,” he said.
Daniel Kelley, an associate director of the Anti-Defamation League’s Center for Technology and Society, told Ars that online game companies are about 10 years behind social media companies in flagging this issue as critical.
Limited funding for efforts to counter extremism in online gaming
Kowert of Take This first became interested in the link between online gaming communities and real-world violent extremism after encountering a 2019 nationally representative ADL survey, which found that almost one in four respondents “were exposed to extremist white supremacist ideology in online games.” Newhouse said that estimate is “probably too conservative at this point.”
Still, ADL said “evidence of widespread extremist recruitment or organizing in online gaming environments (such as in Fortnite or other popular titles) remains anecdotal at best, and further research is needed before any large-scale assertion can be made”.
Today, the research base remains limited, but it has become clear that the problem does not only affect adults. When the ADL expanded its survey in 2021, representing nearly 100 million gamers, it included young players between the ages of 13 and 17 for the first time. ADL found that 10 percent of young gamers were “exposed to white supremacist ideologies in the context of online multiplayer games.”
Kowert immediately responded to the 2019 ADL report by pivoting her research and partnering with Newhouse. She told Ars that there is so little research because there is so little funding.
Kelley told Ars that while it’s good to finally see this research receive funding, the ADL recommends that the government inject far more money to nip the problem in the bud. This is not the moment to stop at one-time funding, Kelley said: “There’s a lot more the Justice Department needs to do to fund these kinds of efforts.”
The gaming industry remains oblivious
Kowert told Ars that game companies have “legitimately” remained “unaware of the scale of the problem” of extremism on their platforms, mainly because they view themselves as gaming platforms first and social platforms second. Newhouse agreed.
“It’s very, very clear in our conversations with the video game industry that they’re not fully aware of the emerging problem they have on their hands,” Newhouse told Ars.
According to Kelley, it’s not just social media counterterrorism efforts that gaming companies need to embrace; the industry could also become safer under regulations like those requiring social media companies to publish transparency reports. The only game company Kelley has ever seen release a transparency report was a small one called Wildlife Studios, which published its first report this year.
“2022 is the first time we’ve received transparency reports from a game company,” Kelley told Ars. “And it’s not from one of the majors. It’s not from EA. It’s not from Riot. It’s not from Blizzard.”
None of the major online game companies mentioned here immediately responded to Ars’ request for comment. Kelley said Roblox is the only major game company with a public policy on online extremism.
Part of the reason video game companies neglect the issue, Kowert says, may be the large body of research showing that online video game content alone does not directly make players more susceptible to extremism.
The American Psychological Association told Ars that its 2020 report that video games do not incite violent behavior is still its most recent statement. But Kowert says focusing discussions on video game content “hampers the conversation.” There needs to be more focus on how players are socially harmed by extremists during gameplay.
Kelley says CTEC’s research is an important first step toward greater government involvement in this issue, but even bringing the gaming industry up to social media standards may be a low bar.
“I think the social media industry still has a long way to go before we have really solid transparency,” Kelley said.
The ADL recommends that online gaming companies go even further than social platforms on transparency; it wants video game companies to conduct audits and to include metrics on “extremism and toxicity in games” in the Entertainment Software Rating Board’s game rating system.
More transparency is exactly what researchers studying extremism in online gaming communities need, Newhouse said, because research is also limited by what information is publicly available. But game companies do not always cooperate enthusiastically with researchers: when Newhouse contacts them, he said, sharing data is not their first instinct, and they generally have to be scared into cooperating with user-protection efforts.
“In all honesty, we usually have to scare companies into listening to us,” Newhouse told Ars.