
Calls for Meta to Revise Its Rules Following Manipulated Video of US President Joe Biden

With a major election looming, Meta’s policy on deepfake content is in urgent need of an update, the company’s oversight board said Monday in its ruling on a manipulated video of US President Joe Biden.

A video of Biden voting with his adult granddaughter, manipulated to falsely appear that he inappropriately touched her chest, went viral last year.

It was reported to Meta, and later to the company’s oversight board, as hate speech.

The technology giant’s oversight board, which independently reviews Meta’s content moderation decisions, said the platform was technically correct to leave the video online.

But it also said the company’s rules on manipulated content are no longer adequate.

The board’s warning came amid fears that AI-powered tools will be widely misused to spread disinformation on social media in a crucial election year, not only in the United States but around the world, as a huge share of the global population heads to the polls.

The board said Meta’s policy in its current form was “incoherent, lacking compelling rationale and inappropriately focused on content creation.”

Instead, the board added, the policy should focus on the “specific harms it seeks to prevent (for example, electoral processes).”

In its response, Meta said it would “review the oversight board’s instructions and publicly respond to their recommendations within 60 days in accordance with the rules.”

According to the board, the rules were not violated in Biden’s case “because the video was not manipulated by artificial intelligence and did not show Biden saying something he did not say.”

However, the board insisted that content altered without AI “is common and not necessarily less misleading.”

For example, most smartphones offer easy-to-use editing features that can turn content into disinformation, sometimes called “cheap fakes,” it noted.

The board also emphasized that manipulated audio, unlike video, falls outside the current policy’s scope, even though deepfake audio can be highly effective at deceiving users.

Already, a robocall impersonating Biden urged New Hampshire residents not to vote in the state’s Democratic primary, prompting state officials to launch an investigation into possible voter suppression.

The oversight board urged Meta to reconsider its manipulated media policy “quickly, considering the number of elections in 2024.”
