Filippo Menczer: How social media filter bubbles work


We tested this hypothesis by studying an algorithm that ranks items using a mix of quality and popularity. We found that, in general, popularity bias tends to lower the overall quality of content. The reason is that engagement is not a reliable indicator of quality when only a few people have been exposed to an item. In those cases, engagement generates a noisy signal, and the algorithm is likely to amplify that initial noise. Once a low-quality article becomes popular enough, its popularity keeps growing.
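To see why, consider a toy sketch in Python (an illustration with made-up numbers, not the model from our study). Two articles have true qualities of 0.8 and 0.4, but the ranking only sees how many of a handful of early viewers engaged with each one:

```python
# Toy illustration: with only a few early viewers, engagement counts are a
# noisy estimate of quality, so an engagement-based ranking often promotes
# the lower-quality item. (All numbers here are assumptions for the sketch.)
import random

rng = random.Random(0)
trials = 10_000
early_viewers = 5
high_quality, low_quality = 0.8, 0.4   # hidden "true" qualities of two items

misranked = 0
for _ in range(trials):
    # Each early viewer independently likes an item with probability equal
    # to its quality, so the counts are noisy reflections of quality.
    high_likes = sum(rng.random() < high_quality for _ in range(early_viewers))
    low_likes = sum(rng.random() < low_quality for _ in range(early_viewers))
    # A ranking based purely on engagement puts the worse item at least as high.
    if low_likes >= high_likes:
        misranked += 1

print(f"worse item ranked at least as high in {misranked / trials:.0%} of trials")
```

In this setup the worse item ties or beats the better one in roughly one out of seven trials. And because the item that wins the early ranking is the one shown to the next wave of users, that initial mistake is exactly what the algorithm goes on to amplify.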

Algorithms aren’t the only things affected by engagement – it can affect people, too. Evidence shows that information spreads by ‘complex contagion’: the more often people are exposed to an idea online, the more likely they are to adopt and reshare it. When social media tells people an item is going viral, their cognitive biases kick in and translate into an almost irresistible urge to pay attention to it and share it.
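Here is a minimal sketch of that dynamic on a made-up friendship network (the names, graph and threshold are illustrative assumptions, not data from any study). A person reshares the idea only after at least two of the people they follow have already shared it, so each extra exposure is what tips them over:

```python
# Toy "complex contagion" sketch on a hypothetical network: a person adopts an
# idea only after THRESHOLD of the people they follow have already shared it.
friends = {              # who each person sees posts from (made-up graph)
    "ana": ["bo", "cai", "dee"],
    "bo":  ["ana", "cai"],
    "cai": ["ana", "bo", "dee"],
    "dee": ["ana", "cai", "eli"],
    "eli": ["dee"],
}
THRESHOLD = 2            # exposures needed before someone reshares
shared = {"bo", "cai"}   # the idea starts with two early sharers

changed = True
while changed:           # keep spreading until no one new adopts
    changed = False
    for person, contacts in friends.items():
        if person in shared:
            continue
        exposures = sum(c in shared for c in contacts)
        if exposures >= THRESHOLD:
            shared.add(person)
            changed = True

print(sorted(shared))    # ana and dee adopt; eli only ever sees one exposure
```

A single exposure never spreads the idea in this model; it is the pile-up of exposures – the very thing engagement-maximizing feeds produce – that drives adoption.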

Crowds not so wise

We recently ran an experiment using a news literacy app called Fakey. It is a game developed by our lab that simulates a news feed like those of Facebook and Twitter. Players see a mix of current articles from fake news, junk science, hyperpartisan and conspiratorial sources, as well as from mainstream sources. They earn points for sharing or liking information from trustworthy sources and for flagging low-credibility articles for fact-checking.

We found that players are more likely to like or share, and less likely to flag, articles from unreliable sources when they can see that many other users have engaged with those articles. Exposure to engagement metrics therefore creates a vulnerability.

The wisdom of crowds fails because it is based on the false assumption that the crowd is made up of diverse and independent sources. There may be several reasons why this is not the case.

First, because people tend to associate with like-minded people, their online neighborhoods are not very diverse. The ease with which social media users can unfriend those they disagree with pushes people into homogeneous communities, often referred to as echo chambers.

Second, because many of a person’s friends are friends of one another, they influence each other. A famous experiment showed that knowing which music your friends like affects your own stated preferences. Your social desire to conform distorts your independent judgment.

Third, popularity signals can be gamed. Over the years, search engines have developed sophisticated techniques to counter “link farms” and other schemes aimed at manipulating search algorithms. Social media platforms, on the other hand, are just beginning to learn about their own vulnerabilities.

People aiming to manipulate the information market have created fake accounts – trolls and social bots – and organized entire fake networks. They have flooded networks to create the appearance that a conspiracy theory or a political candidate is popular, fooling both platform algorithms and people’s cognitive biases. They have even altered the structure of social networks to create illusions about majority opinions.

Reduce engagement

What to do? Technology platforms are currently on the defensive. They become more aggressive around elections, taking down fake accounts and harmful disinformation. But these efforts amount to a game of whack-a-mole.

A different, preventive approach would be to add friction – in other words, to slow down the process of spreading information. High-frequency behaviors such as automated liking and sharing could be inhibited by CAPTCHA tests or fees. Not only would this reduce the opportunities for manipulation; with less information coming at them, people would be able to pay more attention to what they see. It would leave less room for engagement bias to affect people’s decisions.
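To make the idea of friction concrete, here is a small sketch of one possible mechanism (my illustration, not any platform’s actual policy; the thresholds and names are assumptions). Ordinary sharing goes through untouched, but once an account shares at bot-like speed, the next share is held back for an extra check such as a CAPTCHA:

```python
# Sketch of "friction" as a per-user rate limiter: normal sharing is untouched,
# but implausibly fast sharing triggers an extra verification step.
# (Window size and limit are made-up values for illustration.)
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60        # look at the last minute of activity
MAX_SHARES_PER_WINDOW = 5  # above this, require a CAPTCHA-style check

recent_shares = defaultdict(deque)   # user id -> timestamps of recent shares

def allow_share(user_id, now=None):
    """Return True if the share goes through immediately, or False if the
    user should first pass an extra check (CAPTCHA, delay, small fee)."""
    now = time.time() if now is None else now
    timestamps = recent_shares[user_id]
    # Drop timestamps that have fallen out of the time window.
    while timestamps and now - timestamps[0] > WINDOW_SECONDS:
        timestamps.popleft()
    if len(timestamps) >= MAX_SHARES_PER_WINDOW:
        return False       # too fast: add friction before accepting the share
    timestamps.append(now)
    return True

# A burst of automated sharing trips the limiter on the sixth attempt.
for attempt in range(7):
    print(attempt, allow_share("bot-123", now=1000.0 + attempt))
```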

It would also help if social media companies adjusted their algorithms to rely less on engagement signals to determine which content they serve you. Perhaps the revelations that Facebook knew about troll farms exploiting engagement will provide the necessary push.

Filippo Menczer is the Luddy Distinguished Professor of Informatics and Computer Science at Indiana University.

© 2021 The Conversation
Distributed by Tribune Content Agency, LLC.

