Royal Society says fake news has been exaggerated


Most people are sane

The UK’s national academy of sciences, the Royal Society, has released a report on what it calls “the online information environment”, challenging some key assumptions behind the movement to de-platform conspiracy theorists who spread hoaxes on topics such as climate change, 5G and the coronavirus.
The Royal Society has said that while online misinformation is rampant, its influence may be exaggerated, at least in the UK.

“The vast majority of respondents believe that COVID-19 vaccines are safe, that human activity is responsible for climate change, and that 5G technology is not harmful,” the report said.

The report argues that the impact of so-called echo chambers may be similarly exaggerated, and that there is little evidence to support the “filter bubble” hypothesis (essentially, extremist rabbit holes powered by recommendation algorithms).

The researchers also pointed out that many debates over what constitutes misinformation are rooted in disputes within the scientific community, and that the anti-vax movement is far broader than any single set of beliefs or motivations.

The report said the government and social media companies should not rely on the “constant removal” of misleading content, because it is not a “solution to online scientific misinformation”.

It warned that if conspiracy theorists were driven from platforms like Facebook, they could retreat to corners of the web where they would be harder to reach.

The report draws a distinction between scientific misinformation and other content, such as hate speech or illegal media, where removals may be more effective.

“While this approach may be effective and essential for illegal content (e.g. hate speech, terrorist content, child sexual abuse material), there is little evidence to support the effectiveness of this approach for scientific misinformation, and approaches to counter the amplification of misinformation can be more effective. Further, it is difficult to demonstrate a causal link between online misinformation and offline harm, and there is a risk that removing content does more harm than good by directing misinformation content (and people likely to act on it) towards more difficult-to-address corners of the internet.”

Instead of de-platforming, researchers at the Royal Society advocate developing what they call “collective resilience”. Pushing back against scientific misinformation, they suggest, can be done more effectively through other tactics, such as demonetisation, systems to prevent the amplification of such content, and fact-checking labels.

The report encourages the UK government to continue to tackle scientific misinformation, but to focus on the society-wide harm that misinformation on issues such as climate change can cause, rather than the risk to individual users who take the bait.

Other strategies suggested by the Royal Society include pursuing the development of independent, well-funded fact-checking organisations; tackling misinformation “beyond high-risk, high-reach social media platforms”; and promoting transparency and collaboration between platforms and scientists. Finally, the report notes that regulating recommendation algorithms can be effective.
