An oversight board is criticizing Facebook owner Meta's policies. (Unsplash)

Meta Platforms, owner of Facebook, urged by oversight board to reconsider its stance on manipulated media

An oversight board is criticizing the policies of Meta, the owner of Facebook, on manipulated media as “incoherent” and inadequate for tackling the growing flood of online disinformation that has already begun to affect elections around the world this year.

The quasi-independent board said Monday that its review of an altered video of President Joe Biden that went viral on Facebook exposed gaps in the policy. The board said Meta should expand the policy to cover not only AI-generated videos but manipulated media regardless of how it was created, including fake audio recordings of political candidates in the United States and elsewhere.

The company should also identify the harms it is trying to prevent and label images, videos and audio clips as manipulated instead of removing the posts entirely, the Meta Oversight Board said.


The board’s response reflects the intense scrutiny many tech companies face over their handling of election misinformation in a year when voters in more than 50 countries go to the polls. As both generative artificial intelligence deepfakes and lower-quality “cheap fakes” on social media threaten to mislead voters, platforms are trying to catch and respond to false posts while protecting users’ rights to free speech.

“As it stands, the policy makes little sense,” Oversight Board co-chair Michael McConnell said of Meta’s policy in a statement Monday. He said the company should close gaps in the policy while ensuring political speech is “unwaveringly protected.”

Meta said it will review the oversight board’s guidance and respond publicly to its recommendations within 60 days.

Spokesman Corey Chambliss said that while audio deepfakes are not mentioned in the company’s manipulated media policy, they can be fact-checked and will be labeled or down-ranked if fact-checkers rate them as false or altered. The company also takes action against any content that violates Facebook’s Community Standards, he said.

Facebook, which turned 20 this week, remains the most popular social media site among Americans, according to the Pew Research Center. But other social media sites, including Meta’s Instagram, WhatsApp and Threads, as well as X, YouTube and TikTok, are also potential hubs where deceptive media can spread and mislead voters.

Meta launched its oversight board in 2020 to serve as a referee for content on its platforms. The board’s latest recommendations come after it reviewed an altered clip of President Biden and his adult granddaughter that was misleading but did not violate the company’s specific policies.

In the original footage, Biden placed an “I Voted” sticker high up on his granddaughter’s chest following her instructions, then kissed her on the cheek. The version that appeared on Facebook was edited to remove important context, making it appear as if he had touched her inappropriately.

The board’s decision Monday upheld Meta’s 2023 decision to leave the seven-second clip on Facebook because it did not violate the company’s existing manipulated media policy. Meta’s current policy says it will remove videos created with AI tools that misrepresent someone’s speech.

“Because the video in this post was not altered by artificial intelligence and shows President Biden doing something he did not do (not something he did not say), it does not violate the existing policy,” the ruling said.

The board advised the company to update the policy and label similar videos as manipulated in the future. It argued that to protect users’ free speech, Meta should label content as manipulated rather than remove it from the platform, so long as it does not violate any other policies.

The board also noted that some manipulated media is made for humor, parody or satire and should be protected. Instead of focusing on how a distorted image, video or audio clip was created, the company’s policy should focus on the harm manipulated posts can cause, such as disrupting the electoral process, the decision said.

Meta said on its website that it welcomes the oversight board’s ruling on the Biden post and will update its post after reviewing the board’s recommendations.

Meta is obliged to comply with the oversight board’s rulings on specific content decisions, though it is under no obligation to follow the board’s broader recommendations. Still, the board has gotten the company to make some changes over the years, including making its messages to users who violate its policies more specific about what they did wrong.

Jen Golbeck, a professor at the University of Maryland’s College of Information Studies, said Meta is big enough to be a leader in labeling manipulated content, but enforcement matters as much as changing policy.

“Will they implement the changes and then enforce them in the face of political pressure from people who want to do bad things? That’s the real question,” she said. “If they make these changes and don’t enforce them, it sort of contributes to the destruction of trust that misinformation causes.”
