Facebook and Instagram End Global Rules on False Covid-19 Information
Meta Platforms said on Friday that it is ending its global Covid-19 misinformation policy on Facebook and Instagram.
During the pandemic, social media platforms such as Facebook and Twitter came under enormous pressure to crack down on misinformation, including false claims about vaccines.
In 2021, Facebook announced that it had removed 1.3 billion fake accounts between October and December and had taken down more than 12 million pieces of content about Covid-19 and vaccines that global health experts had deemed misinformation.
In July last year, Facebook’s parent asked its independent Oversight Board for an opinion on whether it should change its approach, as authoritative sources of information and public awareness of the coronavirus had improved.
However, Meta said on Friday that the rules will remain in effect in countries that still have a Covid-19 public health emergency declaration, and that the company will continue to remove content that violates its coronavirus misinformation policies.
“We are consulting with health experts to understand which claims and categories of misinformation may continue to pose this risk,” Meta said in a blog post.
Earlier, in November, Twitter also withdrew its Covid-19 misinformation policy.