A report identified 35 cases of kidnapping, grooming, or sexual assault that followed communication on the site.

Discord’s Child Safety Problem Persists

Discord’s child safety problems have been highlighted in a recent report that uncovered some troubling statistics. NBC News has identified 35 cases over the past six years in which adults were prosecuted for alleged “kidnapping, grooming or sexual assault” involving communication on Discord. Of these cases, at least 15 have resulted in guilty pleas or verdicts, while “many others” are still awaiting resolution.

The reporters also found 165 other cases, including four crime rings, in which adults were accused of sharing CSAM (child sexual abuse material) through Discord or of allegedly using the site to extort children into sending sexually graphic images of themselves, a practice known as sextortion. According to the report, these illegal acts often take place in hidden communities and chat rooms.

A simple Google search for “site:justice.gov Discord” yields a large number of hits, many of which are disturbing in nature. In one case identified by NBC News, “a teenager was taken across state lines, raped and found locked in a backyard shed, police say, after being groomed on Discord for months.”

“What we’re seeing is just the tip of the iceberg,” Stephen Sauer of the Canadian Centre for Child Protection told NBC News. And this isn’t the first time Discord has come under fire for its handling of child abuse complaints. Last year, CNN also identified numerous cases of CSAM, with some parents claiming that Discord offered little help.

Earlier this year, the National Center on Sexual Exploitation (NCOSE) issued a statement titled “Discord Dishonest in Its Response About How It Handles Child Sexual Exploitation Material After Being Named to the 2023 Dirty Dozen List.” Among other things, it noted that CSAM links it had identified and reported were still available on Discord’s servers “after more than two weeks.” It described Discord’s response to the issues as “too passive,” saying the platform fails to proactively seek out and remove exploitative material. It recommended that the site, which currently has more than 150 million users, ban minors “until it changes radically.”

In a recent transparency report, Discord said its “investment and prioritization in child safety has never been stronger,” adding that it disabled 37,102 accounts and removed 17,425 servers for child safety violations. John Redgrave, the company’s director of trust and safety, told NBC News that he believes the platform’s approach to the problem has improved since Discord acquired artificial intelligence moderation company Sentropy in 2021. The platform now uses multiple systems to proactively detect CSAM and analyze user behavior. Redgrave said he thinks the company is now proactively detecting most material that has already been “identified, reviewed and indexed.”

However, those systems are currently unable to detect child sexual abuse material or messages that have yet to be indexed. In a review of Discord servers created in the past month, NBC News found 242 servers using thinly disguised terms to market CSAM.

Discord isn’t the only social media company with CSAM issues. A recent report found that Instagram helped “connect and promote a vast network of accounts” dedicated to content aimed at minors. However, Discord has reportedly made things difficult for law enforcement at times; in one case, according to the report, the company requested payment after Ontario police asked it to preserve records. ReturnByte has contacted Discord for comment.
