Over 22 Million Pieces of Harmful Content Removed from Facebook and Instagram in India
Meta said it removed more than 17.8 million pieces of content across 13 Facebook policies and more than 4.8 million pieces of content across 12 Instagram policies in India in January.
In January, Facebook received 29,548 complaints through India's grievance mechanism, and Meta said it provided tools for users to resolve their issues in 21,060 of those cases.
These include, for example, pre-established channels for reporting content for specific violations, self-remediation flows where users can download their data, and avenues to address hacked-account issues, Meta said in its monthly report under the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
“Of the remaining 8,488 reports that required special review, we analyzed the content in accordance with our policies and took action on a total of 4,632 complaints. The remaining 3,856 complaints were addressed but may not have been acted upon,” Meta added.
On Instagram, the company received 19,311 reports through India's grievance mechanism.
“Of these, we provided users with tools to resolve issues in 9,476 cases,” it said.
Of the other 9,835 reports that required specialised review, Meta analysed the content and took action on a total of 4,849 complaints.
The remaining 4,986 reports were reviewed but may not have been acted upon.
Under the IT Rules, 2021, large digital and social media platforms with more than 5 million users must publish monthly compliance reports.
“We measure the number of pieces of content (such as posts, photos, videos, or comments) that we take action on for actions that violate our standards. Actions could include removing content from Facebook or Instagram, or masking images or videos that may be disturbing to some audiences with a warning,” Meta said.
In December 2023, Meta removed more than 19.8 million pieces of content from 13 Facebook policies and more than 6.2 million pieces of content from 12 Instagram policies.