2024 Election Polarization Foiled: Thousands of Fake Facebook Accounts Shut Down by Meta!
On Thursday, Meta said that someone in China had created thousands of fake social media accounts designed to look like they belonged to Americans, and had used them to spread polarizing political content, apparently in an effort to divide the United States ahead of next year's elections.
A network of nearly 4,800 fake accounts was trying to build an audience when the tech company that owns Facebook and Instagram identified and eliminated it. The accounts had fake photos, names and locations to make them look like American Facebook users discussing political issues.
Rather than spreading fake content, as other networks have done, the accounts reshared posts from X, the platform formerly known as Twitter, that were created by politicians, news outlets and others. Taken together, the accounts amplified material from both liberal and conservative sources, suggesting the goal was not to support one side but to exaggerate partisan divisions and further inflame polarization.
The newly identified network shows how America's foreign adversaries are exploiting US-based technology platforms to sow discord and mistrust, foreshadowing the threats posed by online disinformation next year, when national elections are held in the US, India, Mexico, Ukraine, Pakistan, Taiwan and other countries.
“These networks are still struggling to build an audience, but they’re a warning,” said Ben Nimmo, who leads investigations into fraudulent behavior on Meta’s platforms. “Foreign threat actors are trying to reach people via the internet before next year’s elections, and we need to remain vigilant.”
Meta Platforms Inc., based in Menlo Park, Calif., did not publicly link the network to the Chinese government, but it did determine that the network originated in that country. The content spread by the accounts broadly complements other Chinese government propaganda and disinformation that has sought to deepen partisan and ideological divisions in the United States.
To look more like regular Facebook accounts, the network sometimes posted about fashion or pets. Earlier this year, some accounts abruptly replaced their American-sounding usernames and profile pictures with new ones suggesting they lived in India. The accounts then began spreading pro-China content about Tibet and India, illustrating how fake networks can be redirected to new targets.
Meta often points to its efforts to shut down fake social media networks as evidence of its commitment to protecting electoral integrity and democracy. But critics say the platform’s focus on fake accounts distracts from its failure to address misinformation already on its site, which has fueled polarization and mistrust.
For example, Meta accepts paid ads on its site claiming that the 2020 US election was rigged or stolen, reinforcing the lies of former President Donald Trump and other Republicans whose claims of election fraud have been repeatedly debunked. Federal and state election officials, as well as Trump's own attorney general, have said there is no credible evidence that the presidential election, which Trump lost to Democrat Joe Biden, was tainted.
When asked about its advertising policy, the company said it focuses on upcoming elections, not past ones, and rejects ads that cast unfounded doubt about future races.
And while Meta has announced a new AI policy requiring political ads to carry a disclaimer if they contain AI-generated content, the company has allowed other altered videos created with more conventional editing tools to remain on the platform, including a digitally edited video of Biden that falsely portrays him as a pedophile.
“This is a company that cannot be taken seriously and cannot be trusted,” said Zamaan Qureshi, a policy adviser at the Real Facebook Oversight Board, an organization of civil rights leaders and technology experts that has criticized Meta's handling of misinformation, disinformation and hate speech. “Watch what Meta does, not what they say.”
Meta executives discussed the network’s operations during a conference call with reporters on Wednesday, a day after the tech giant announced its policies for the upcoming election year — most of which were put in place for the previous election.
But 2024 poses new challenges, according to experts studying the link between social media and disinformation. Not only are many major countries holding national elections, but the emergence of sophisticated artificial intelligence software means it’s easier than ever to create lifelike audio and video that can mislead voters.
“Platforms are still not taking their role in the public sphere seriously,” said Jennifer Stromer-Galley, a Syracuse University professor who studies digital media.
Stromer-Galley called Meta's election plans “modest,” but noted that they stand in stark contrast to the “Wild West” of X. Since Elon Musk bought the platform, then called Twitter, it has eliminated teams focused on content moderation and welcomed back many users who had previously been banned for hate speech or for using the site to spread conspiracy theories.
Democrats and Republicans have pushed for laws addressing algorithmic recommendations, misinformation, deepfakes and hate speech, but there is little chance of major legislation passing before the 2024 election. That leaves the platforms to police themselves voluntarily.
Meta's efforts to protect elections are “a terrible preview of what we can expect in 2024,” said Kyle Morse, deputy director of the Tech Oversight Project, a nonprofit that supports new federal regulations for social media. “Congress and the administration must act now to ensure that Meta, TikTok, Google, X, Rumble and other social media platforms do not actively assist foreign and domestic actors who openly undermine our democracy.”
Many of the fake accounts identified by Meta this week also had nearly identical accounts on X, where some of them regularly retweeted Musk’s messages.
These accounts remain active on X. A message seeking comment from the platform was not returned.
Meta also released a report on Wednesday assessing the risk that foreign adversaries such as Iran, China and Russia would use social media to interfere in the election. The report noted that Russia’s recent disinformation efforts have focused not on the United States but on its war against Ukraine, using state media propaganda and disinformation to undermine support for the invaded nation.
Nimmo, Meta's principal investigator, said that turning opinion against Ukraine is likely to be the focus of any disinformation Russia seeks to inject into American political debate ahead of next year's election.
“This is important ahead of 2024,” Nimmo said. “As the war continues, we should especially expect Russian attempts to target election-related debates and candidates who support Ukraine.”