Facebook has become Meta, and the company’s dangerous behavior came into focus in 2021: 4 essential reads

Meta, born Facebook, had a difficult 2021 in public opinion, if not financially. Whistleblower Frances Haugen’s revelations, first detailed in a series of Wall Street Journal investigations and then presented in testimony to Congress, show the company was aware of the harm it was causing.

Growing concerns about misinformation, emotional manipulation and psychological harm came to a head this year when Haugen released internal company documents showing that the company’s own research confirmed the societal and individual damage caused by its Facebook, Instagram and WhatsApp platforms.

The Conversation has assembled four articles from our archive that explore the research that explains Meta’s problematic behavior.

1. Addicted to engagement

At the root of Meta’s harmfulness is its set of algorithms, the rules the company uses to choose what content you see. The algorithms are designed to increase the company’s profits, but they also allow misinformation to flourish.

The algorithms work by increasing engagement, that is, by eliciting responses from the platform’s users. Filippo Menczer of Indiana University, who studies the spread of information and misinformation on social media, explains that engagement plays into people’s tendency to favor posts that seem popular. “When social media tells people that something is going viral, their cognitive biases kick in and translate into the overwhelming urge to pay attention to it and share it,” he wrote.

One result is that shoddy news that gets an initial boost can attract more attention than it deserves. Worse yet, this dynamic can be gamed by people aiming to spread disinformation.

“People aiming to manipulate the information market have created fake accounts, like trolls and social bots, and organized fake networks,” Menczer wrote. “They flood the network to make it appear that a conspiracy theory or a political candidate is popular, fooling both the platform’s algorithms and people’s cognitive biases.”

Read more: Facebook whistleblower Frances Haugen says the company’s algorithms are dangerous – here’s how they can manipulate you

2. Teen girls’ self-esteem takes a hit

Some of the most disturbing revelations concern the harm Meta’s social media platform Instagram is doing to teenagers, especially teenage girls. University of Kentucky psychologist Christia Spears Brown explains that Instagram can lead teens to objectify themselves by focusing on how their bodies appear to others. It can also lead them to make unrealistic comparisons of themselves with celebrities and with filtered and retouched images of their peers.

Even when teens know the comparisons are unrealistic, they end up feeling worse about themselves. “Even in studies in which participants knew the photos shown to them on Instagram had been retouched and reshaped, teenage girls felt worse about their bodies after viewing them,” she wrote.

“The choices being made inside Facebook are disastrous for our children,” whistleblower Frances Haugen told Congress.

The problem is widespread because Instagram is where teens tend to hang out online. “Teens are more likely to connect to Instagram than any other social media site. It is a ubiquitous part of teenage life,” Brown wrote. “Yet studies consistently show that the more often teens use Instagram, the worse their overall well-being, self-esteem, life satisfaction, mood and body image.”

Read more: Facebook has known for a year and a half that Instagram is bad for teens despite claiming otherwise – here are the harms researchers have been documenting for years

3. Obscuring the numbers on harm

Meta has, unsurprisingly, pushed back against claims of harm despite the disclosures in its leaked internal documents. The company has pointed to research showing that its platforms do not cause harm in the ways many researchers describe, and it says the overall picture from all the research on harm is unclear.

University of Washington computational social scientist Joseph Bak-Coleman explains how Meta’s research can be both accurate and misleading. The explanation lies in averages. Meta’s studies look at effects on the average user. But because Meta’s social media platforms have billions of users, harm that is invisible in the average user’s experience can still afflict many thousands of people.

“The inability of this type of research to capture the smaller but still significant numbers of people at risk – the tail end of the distribution – is compounded by the need to measure a range of human experiences in discrete increments,” he wrote.

Read more: Thousands of vulnerable people hurt by Facebook and Instagram lost in Meta ‘average user’ data

4. Hiding the numbers on disinformation

Just as evidence of emotional and psychological harm can be lost in averages, evidence of the spread of disinformation can be lost without the context of another kind of calculation: fractions. Despite substantial efforts to track disinformation on social media, it’s impossible to know the extent of the problem without knowing the total number of posts social media users see each day. And that is information Meta does not make available to researchers.

The total number of posts is the denominator to the disinformation numerator in the fraction that tells you how bad the disinformation problem is, explains Ethan Zuckerman of UMass Amherst, who studies social and civic media.


The denominator problem is compounded by the distribution problem, which is the need to determine where the disinformation is concentrated. “Simply counting instances of disinformation found on a social media platform leaves two key questions unanswered: How likely are users to encounter disinformation, and are certain users especially likely to be affected by disinformation?” he wrote.

This lack of information is not unique to Meta. “No social media platform allows researchers to accurately calculate the prevalence of a particular piece of content on its platform,” Zuckerman wrote.

Read more: Facebook has a misinformation problem and is blocking access to data about how much there is and who is affected

Editor’s note: This story is a roundup of articles from The Conversation’s archives.
