The case involves an image of Squidward edited to say that the Holocaust didn't happen.

Meta’s Oversight Board Opens Investigation into Holocaust Denial

Meta’s Oversight Board is taking up a new case that aligns with its strategic priorities. In a recent announcement, the board said it will examine Meta’s decision not to remove content denying the Holocaust from its platforms. The case concerns an Instagram post featuring an image of Squidward from SpongeBob SquarePants with a speech bubble denying that the Holocaust happened. The post’s caption and hashtags were also aimed at specific geographical audiences. The board will review the case and invite public comments over the coming weeks.

The post was originally published in September 2020 by an account with 9,000 followers and was viewed approximately 1,000 times. A few weeks later, Meta revised its content policies to prohibit Holocaust denial. Despite the new rules and reports from multiple users, the post was not promptly removed. Some of the reports were automatically closed under the company’s “automation policies related to the COVID-19 situation,” which were put in place to let Meta’s limited pool of reviewers prioritize reports deemed “high risk.” In other cases, users were automatically told that the content did not violate Meta’s policies.

One of the users who reported the post appealed Meta’s decision to the Oversight Board, which says the case aligns with its efforts to prioritize “hate speech against marginalized groups.” The board is now asking for public comments on several related issues, such as the use of automation to combat hate speech and the usefulness of Meta’s transparency reporting.

In a post on its transparency page, Meta acknowledged that it left the content up after the initial review, but it has since determined that decision was made in error and that the post violated its hate speech policy. The company has removed the content from its platforms and has promised to implement the board’s decision on the case. The Oversight Board can also make policy recommendations based on its review, but those are not binding, and the company is not obliged to follow them. Based on the questions the board wants the public to answer, it could issue recommendations that would change the way Meta uses automation to police Instagram and Facebook.
