
Research Finds No Evidence of Facebook Algorithm Changing People’s Beliefs

Are social media echo chambers exacerbating political polarization, or merely mirroring pre-existing social divisions?

A major research project examining Facebook around the 2020 presidential election released its first results Thursday, finding that, contrary to widespread assumptions, the platform’s oft-criticized content-ranking algorithm does not appear to shape users’ political beliefs.

The work is the result of a collaboration between Meta, the parent company of Facebook and Instagram, and a group of researchers from US universities, who were given extensive access to the company’s internal data and who registered tens of thousands of users for the trials.

The academic team wrote four papers examining the social media giant’s role in American democracy, which were published in the scientific journals Science and Nature.

Overall, the algorithm was found to be “extremely influential” in shaping users’ experiences on the platforms, said project leaders Talia Stroud of the University of Texas at Austin and Joshua Tucker of New York University.

In other words, it strongly influenced what users saw and how much they used the platforms.

“But we also know that changing the algorithm, even for a few months, is unlikely to change people’s political attitudes,” they said, as measured by users’ responses to surveys after they participated in three-month experiments that changed the content they received.

The authors acknowledged that this conclusion may reflect the fact that the changes were not in place long enough to have an effect, given that the United States has been growing more polarized for decades.

Nevertheless, “these findings challenge popular narratives that blame social media echo chambers for the problems of modern American democracy,” wrote the authors of one paper published in the journal Nature.

– “No silver bullets” –

Facebook’s algorithm, which uses machine learning to decide which posts rise to the top of users’ feeds based on their interests, has been accused of creating “filter bubbles” and enabling the spread of misinformation.

The researchers recruited about 40,000 volunteers through invitations on their Facebook and Instagram feeds and designed an experiment in which one group kept the standard algorithmic feed, while the other saw posts in reverse chronological order, from newest to oldest.

Facebook originally used such a reverse-chronological system, and some observers have suggested that returning to it would reduce social media’s harmful effects.

The team found that users in the chronological-feed group spent about half as much time on Facebook and Instagram as those in the algorithmic group.

On Facebook, those in the chronological group saw more content from moderate friends and more sources with ideologically mixed audiences.

But the chronological feed also increased the amount of political and unreliable content users saw.

Despite these differences, the switch produced no detectable shift in measured political attitudes.

“The findings suggest that a chronological feed is not a silver bullet for problems such as political polarization,” said Stanford University researcher Jennifer Pan.

– Meta satisfied with findings –

In another Science publication, the same team examined the impact of reshared content, which accounts for more than a quarter of the content seen by Facebook users.

Blocking resharing has been proposed as a way to curb harmful viral content.

The team conducted a controlled experiment in which a group of Facebook users saw no changes to their feeds, while another group had the reshared content removed.

Removing reshares reduced the proportion of political content users saw, and with it their political knowledge, but again had no effect on downstream political attitudes or behavior.

The third Nature publication examined the influence of content from “like-minded” users, pages and groups, which the researchers found makes up the majority of what active adult Facebook users in the United States see in their feeds.

But in an experiment involving more than 23,000 Facebook users, suppressing like-minded content had no effect on ideological extremism or belief in false claims.

However, a fourth article in the journal Science confirmed extreme “ideological segregation” on Facebook, with politically conservative users far more siloed in their news sources than liberals.

In addition, 97 percent of political news URLs on Facebook that were deemed false by Meta’s third-party fact-checking program – which includes AFP – were seen by more conservatives than liberals.

Meta was satisfied with the overall results.

They “add to a growing body of research showing that there is little evidence that social media causes harmful … polarization or has a significant impact on key political attitudes, beliefs or behaviours,” said Nick Clegg, the company’s head of global affairs.
