Report Reveals Pedophile Networks Are Using Instagram to Spread Child Sexual Abuse Material

A report by Stanford University and the Wall Street Journal has revealed that pedophile networks primarily use Instagram to advertise and distribute content featuring child sexual abuse.

“Large networks of accounts that appear to be run by minors are openly promoting self-produced child sexual exploitation material for sale,” said researchers at the Cyber Policy Center at the US university.

“Instagram is currently the most important platform for these networks, with features like recommendation algorithms and direct messages that help connect buyers and sellers.”

A Meta spokesperson told AFP on Thursday that the company is working “aggressively” to combat child abuse and support police efforts to catch those involved.

“Child abuse is a horrific crime,” the Meta spokesperson said in response to an AFP inquiry.

“We are constantly exploring ways to proactively defend against this behavior and have established an internal task force to investigate these allegations and address them immediately.”

Meta teams dismantled 27 abuse networks between 2020 and 2022, and in January of this year the company terminated more than 490,000 accounts for violating its child safety policies, the spokesperson added.

“We are committed to continuing our work to protect teenagers, deter criminals and support law enforcement in bringing them to justice,” the spokesperson said.

According to the Journal, a simple search using sexually explicit keywords that specifically refer to children leads to accounts that use those terms to promote content that depicts the sexual exploitation of minors.

The profiles “often claim to be run by the children themselves and use overtly sexualized pseudonyms,” the article says.

While the accounts do not explicitly state that they are selling these images, they feature menus of options, in some cases including specific sex acts.

Stanford researchers also found offers for videos featuring animals and self-harm.

“For a fee, children are available for one-on-one ‘meetings,’” the article continued.

Last March, pension and investment funds filed a complaint against Meta, accusing it of having “turned a blind eye” to human trafficking and child sexual abuse imagery on its platforms.

By the end of last year, Meta’s detection technology had removed more than 34 million pieces of child abuse content from Facebook and Instagram, all but a small portion of it automatically, according to the Silicon Valley company.
