Meta’s Facebook and Instagram took action against more than 12 million pieces of ‘Child Endangerment’ content in 2023.
In 2023, Meta, the parent company of Facebook and Instagram, took action against more than 12 million pieces of content related to child endangerment involving nudity, physical abuse and sexual abuse.
According to Meta India’s monthly compliance reports, filed under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, the company actioned 1,21,80,300 such pieces of content between January 2023 and December 2023.
The company said: “We measure the number of pieces of content (such as posts, photos, videos or comments) we take action on because they go against our standards. This metric shows the extent of our enforcement actions. Actions may include removing content from Facebook or Instagram, or masking images or videos that may be disturbing to some audiences with a warning.”
Between January 1 and December 31, Facebook data showed that action was taken on a total of 46,81,300 content items under the categories “Child Endangerment – Nudity and Physical Abuse” and “Child Endangerment – Sexual Abuse”.
Meanwhile, Instagram saw a higher number, with 74,99,000 actions taken over the same period, a significant proportion of them related to sexual abuse; around 4 million of these actions came in January 2023 alone.
According to Meta: “In July 2018, we updated our methodology to clarify how many discrete pieces of content we have taken action on for violating our policies, and we continue to mature and improve our methodology as part of our commitment to providing the most accurate and meaningful metrics. Overall, our goal is to provide an accurate representation of the total number of content items we take action on for violating our policies.”
However, it should be noted that Mark Zuckerberg, CEO of Meta, recently joined the heads of Twitter, now called X, and TikTok in an appearance before the US Senate Judiciary Committee. The appearance comes amid growing concern among US lawmakers and parents about the impact of social media on young people’s lives.
As far as India is concerned, the central government has clearly stated its position: the goal is to ensure that the Internet in India is open, safe, trusted and accountable for all Digital Nagriks.
Rule 4(2) of the IT Rules, 2021 requires major social media platforms to assist law enforcement agencies in identifying the first originator of information in relation to serious matters such as national security, foreign relations, public order, rape and child sexual abuse material (CSAM).
In his parliamentary reply, Minister of State for Electronics and IT Rajeev Chandrasekhar stated: “The IT Rules 2021 impose specific legal obligations on intermediaries, including social media intermediaries and platforms, to ensure their responsibilities towards a safe and trusted internet, including their prompt action to remove prohibited information that is obscene, pornographic, pedophilic, invasive of another’s privacy including bodily privacy, etc., as well as misinformation, patently false information and deep fakes.”
“If intermediaries fail to comply with their statutory obligations under the IT Rules 2021, they will lose their safe harbour protection under Section 79 of the Information Technology Act and will be liable to action or prosecution under any law for the time being in force, including the IT Act and the Indian Penal Code, such as Sections 292 and 293 of the IPC,” he added.