‘Bot holiday’: Covid misinformation down as social media pivots to Ukraine | Social networks
When David Fisman tweets, he often receives a barrage of hate within moments of posting. Fisman, an epidemiologist and physician, has been candid about Covid and public health.
Even when he tweets something harmless – once, to test his theory, he wrote the banal statement “children are remarkable” – he always receives a flood of angry reactions.
But in recent days, Fisman has noticed an “astonishing” trend, he said. He has posted on topics such as vaccine requirements and improving ventilation to prevent the spread of Covid – and the angry responses never came. No support for the truck convoy, no calls to try the Canadian prime minister, Justin Trudeau, for treason.
Others have observed the same phenomenon: those who frequently encounter bots or angry replies are now seeing a notable lull. Misinformation about Covid, which has often trended on social media over the past couple of years, seems to be plummeting.
The reasons for this “bot holiday,” as Fisman calls it, are likely varied – but many of them have to do with the Russian invasion of Ukraine.
The information war between Russia and western countries appears to be shifting to new fronts, from vaccines to geopolitics.
And while social media has proven a powerful tool for Ukraine – with images of Zelenskiy walking the streets of Kyiv and tractors towing abandoned Russian tanks – growing disinformation campaigns around the world could change the narrative of the conflict and how the world responds.
There are many likely reasons for the shift in online chatter. Russia began limiting access to Twitter on Saturday, sanctions have been imposed on those who may fund disinformation sites and bot farms, and social media companies are more willing to ban bots and accounts spreading disinformation during the conflict.
But something more coordinated may also be at play, according to emerging research.
“There’s actually been a doubling of New World Order conspiracies on Twitter since the invasion,” said Joel Finkelstein, chief science officer and co-founder of the Network Contagion Research Institute, which maps online campaigns around public health, economic and geopolitical issues.
At the same time, “where before the topics were very diverse – it was Ukraine and Canada and the virus and the global economy – now the whole conversation is about Ukraine,” he said. “We are seeing a seismic shift in the disinformation sphere toward Ukraine entirely.”
Online activity has increased 20% overall since the invasion, and new hashtags have appeared around Ukraine that appear to be coordinated with bot-like activity, Finkelstein said. Users pushing new campaigns frequently tweet hundreds of times a day and may attract the attention of prominent authentic accounts.
“We cannot say for sure that Russia is behind this or directly contributing to the spread of these messages. But it’s pretty hard to believe it’s not involved,” Finkelstein said, with topics strikingly similar to Russian talking points about Ukraine’s president, Volodymyr Zelenskiy, being western-controlled and the need to dissolve Nato.
A Russian bot farm reportedly produced 7,000 accounts to post false information about Ukraine on social media, including Telegram, WhatsApp and Viber, according to the Security Service of Ukraine.
And influencers who once protested against vaccines are now pivoting to support of Russia.
Social media users may see a trending topic and not realize its connection to conspiracy theories or disinformation campaigns, said Esther Chan, editor of the Australian office of First Draft, an organization that researches misinformation.
“A lot of social media users may just use these terms because they’re trendy, they sound good,” she said. “It’s kind of a very clever astroturfing strategy that we’ve seen over the last few years.”
The topics pushed by troll farms and Russian state media are often dictated by Russian officials, said Mitchell Orenstein, a professor of Russian and east European studies at the University of Pennsylvania and senior fellow at the Foreign Policy Research Institute.
In this case, it appears “their orders were changed because the priorities changed,” he said.
Inauthentic accounts are not entirely responsible for genuine hesitations and beliefs. But they amplify harmful messages and make pushback seem more widespread than it is.
“They’ve had huge success with social media platforms,” Orenstein said. “They play a pretty big role and they change people’s perception of what public opinion is.”
Fake accounts will frequently link to “pink slime” or low-credibility sites that once carried false stories about the pandemic and now focus on Ukraine, said the Carnegie Mellon University professor Kathleen Carley.
“The bots themselves don’t create news – they’re used more for amplification,” she said.
The escalation of stories like these could have far-reaching consequences for politics.
“Right now, we’re at the start of a war where there’s consensus, right? It’s clear that what Russia is doing violates the moral order of the modern world. But as the war drags on and people get exhausted, that could change,” Finkelstein said.
As “we move into more unknown territory, these narratives will have a chance to develop…it gives us a window into what these themes are going to look like.”
Research around these shifting campaigns is limited, examining thousands of tweets in the early days of the invasion, Carley cautioned. It is too early to know which direction the misinformation is heading and who is behind it – and conspiracies tend to follow current events even when there are no coordinated campaigns.
And “that’s not to say that all the misinformation, all the Covid conspiracy theories aren’t still out there,” she said. “I wouldn’t say the bots are on vacation. They’ve been re-targeted on different stories now, but they’ll be back.”
On 3 March, the US surgeon general, Vivek Murthy, asked tech companies to hand over what they know about who is behind Covid-19 misinformation. Murthy wants social media, search engine, crowdsourcing, e-commerce and instant messaging companies to provide data and analysis on the kinds of vaccine misinformation identified by the CDC, such as “Covid-19 vaccine ingredients are dangerous” and “Covid-19 vaccines contain microchips”.
Disinformation campaigns around the New World Order, however, have more longevity than some other conspiracy theories, as they can quickly morph depending on the target. “They’ll probably be around for a long time yet,” Chan said. “The question for us is whether they would have an impact on people – on real life and also on policy-making.”
It may be too early to tell what will emerge from the invasion of Ukraine, but leaders need to understand which terms are surfacing in conspiracy theories and disinformation campaigns so they do not inadvertently signal support for those theories in their public statements, she said.
“They should take note of commonly used terms and try to avoid them,” Chan said.
A global agreement on how to tackle misinformation or disinformation would be key, Carley said.
“Each country does it separately. And the thing is, because we’re all very closely connected across the world on social media, it doesn’t matter if one country has strong regulations, because it’s always going to go from another country’s machines to your machines,” she said.
Such rules should also have teeth to prevent new campaigns, she said. And educating the public on how to analyze misinformation and disinformation is also important. “We need to start investing better in critical thinking and digital media literacy.”