Study reveals impact of Meta’s algorithms on 2020 election content in users’ feeds
Almost three years ago, Meta announced a collaboration with more than a dozen independent researchers to examine the influence of Facebook and Instagram on the 2020 election. The project aimed to provide an unbiased analysis of topics such as polarization and misinformation, drawing on extensive internal data. Both Meta and the researchers promised the public an impartial account.
Now we have the first results from this research in the form of four peer-reviewed papers published in the journals Science and Nature. The research offers a fascinating new look at how Facebook and Instagram’s algorithms affected what users saw ahead of the 2020 presidential election.
The papers are also a significant milestone for Meta. The company has at times had a strained relationship with independent researchers and has been accused of “openness theater” in its efforts to provide more information to those who want to understand what is happening on its platforms. Meta’s head of policy, Nick Clegg, said in a statement that the research suggests Facebook may not be as influential in shaping its users’ political beliefs as many believe. “The experimental studies add to a growing body of research showing that there is little evidence that key features of Meta platforms alone cause harmful ‘affective’ polarization or have significant effects on key political attitudes, beliefs, or behaviors,” he wrote.
However, the researchers’ preliminary findings seem to paint a more complex picture.
One study published in the journal Nature looked at the effect of so-called “echo chambers,” or when users are exposed to a large number of “like-minded” sources. While the researchers confirm that most US users see the majority of content from “like-minded friends, pages and groups,” they note that not all of it is specifically political or news-related. They also found that reducing the amount of “like-minded” content reduced engagement but did not measurably change user beliefs or attitudes.
While the authors note that the results do not account for the “cumulative effects” of years of social media use on their subjects, they do suggest that the effects of echo chambers are often misrepresented.
Another study, published in the journal Science, compared the effects of a chronological feed with those of the algorithmically ranked feed. This question became particularly prominent in 2021 thanks to the revelations of whistleblower Frances Haugen, who has advocated a return to chronological feeds. Unsurprisingly, the researchers concluded that Facebook and Instagram’s algorithmic feeds “strongly influenced user experience.”
“The chronological feed dramatically reduced the amount of time users spent on the platform, reduced how much users engaged with content while on the platform, and changed the mix of content served to them,” the authors write. “Users saw more content from ideologically moderate friends and sources with mixed audiences; more political content; more content from unreliable sources; and less content classified as profanity or misrepresentation than they would in the Algorithmic Feed.”
Meanwhile, the researchers say that the chronological feed “did not produce detectable changes in downstream political attitudes, knowledge, or offline behavior.”
Similarly, another study, also published in Science, examined the effects of reshared content before the 2020 election. It found that removing reshared content “significantly reduces the amount of political news, including content from unreliable sources,” but does not “significantly affect political polarization or any measure of individual-level political attitudes.”
Finally, the researchers analyzed the political news in users’ feeds in terms of whether it skewed liberal or conservative. They concluded that Facebook is “substantially ideologically segregated,” but that “ideological segregation is much more evident in content posted by pages and groups than in content posted by friends.” They also found that conservative users were much more likely to see content from “untrusted” sources, as well as articles judged to be false by the company’s third-party fact-checkers.
The researchers said the results were “an indication of how Pages and Groups provide a highly efficient curation and dissemination engine that is used particularly effectively by sources with a predominantly conservative audience.”
While some of the findings look good for Meta, which has long argued that political content is only a small minority of what most users see, one of the most significant takeaways from the research is that there are no obvious solutions for addressing the polarization that does occur on social media. “The results of these experiments do not show that the platforms are not the problem, but they do show that they are not the solution,” David Garcia of the University of Konstanz, who was part of the research team, told Science.