Mastodon’s Decentralized Social Network Struggles with Child Sexual Abuse Material
Mastodon’s popularity has surged over the last year as Twitter users sought alternatives following Elon Musk’s takeover. A big part of its appeal is its decentralized nature, which keeps it out of the hands of any single capricious billionaire. However, that same structure has created serious challenges, making consistent content moderation across the network nearly impossible.
A new Stanford study found 112 matches of known child sexual abuse material (CSAM) over a two-day period, along with nearly 2,000 posts using common abuse-related hashtags. Researcher David Thiel says, “We got more PhotoDNA hits in two days than we’ve probably had in the entire history of our organization doing any kind of social media analysis, and it’s not even close.” We’ve reached out to Mastodon for comment and will update this story when we hear back.
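For readers unfamiliar with how this kind of detection works, here is a rough Python sketch of matching cached media against a list of known-abuse hashes. PhotoDNA itself is a proprietary perceptual-hashing system that tolerates resizing and re-encoding; the SHA-256 hashing, directory name, and placeholder hash list below are stand-in assumptions used only to keep the sketch self-contained.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes of known abusive media. Real deployments use
# perceptual hashes (e.g. PhotoDNA), which survive re-encoding; a plain
# cryptographic hash is used here purely for illustration.
KNOWN_ABUSE_HASHES = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}

def scan_media(directory: str) -> list[Path]:
    """Return files whose hashes match the known-abuse list."""
    matches = []
    for path in Path(directory).glob("*"):
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest in KNOWN_ABUSE_HASHES:
            matches.append(path)  # in practice: report to authorities, don't just collect
    return matches

if __name__ == "__main__":
    flagged = scan_media("./cached_media")  # hypothetical media cache
    print(f"{len(flagged)} files matched known-abuse hashes")
```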
Of course, the big problem with decentralized social media platforms like Mastodon is that no single company or community controls everything on the network. Each instance has its own administrators, who are ultimately responsible for moderating it, but those administrators have no way to monitor or control what happens on other instances or servers.
This isn’t exclusively a Mastodon problem, either. Meta’s popular Threads app is also built around a decentralized model. Although the support hasn’t shipped yet, Threads plans to interoperate with the ActivityPub protocol, which would let Threads users follow, reply to, and repost Mastodon content, and vice versa.
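To give a sense of what that interoperability looks like under the hood, here is a minimal Python sketch of the ActivityPub “Follow” activity that federation is built on. The threads.example and mastodon.example URLs are made up for illustration, and a real server would also require the request to be cryptographically signed (HTTP Signatures) before accepting it.

```python
import requests

# A bare-bones ActivityPub Follow activity: one actor asking to follow
# another actor on a different server. All URLs here are hypothetical.
follow_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": "https://threads.example/users/alice/follows/1",
    "type": "Follow",
    "actor": "https://threads.example/users/alice",
    "object": "https://mastodon.example/users/bob",
}

# Federation works by delivering the activity to the target actor's inbox.
response = requests.post(
    "https://mastodon.example/users/bob/inbox",
    json=follow_activity,
    headers={"Content-Type": "application/activity+json"},
)
print(response.status_code)
```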
This creates a unique problem for Meta, which won’t be able to control the entire moderation pipeline the way it does on Facebook or Instagram. Even with that control, the company struggles to keep up. Presumably, larger Mastodon instances and platforms like Threads can block access to problematic instances. Of course, that wouldn’t “solve” the problem. The content would still exist; it would just be siloed away, left to the moderators of that instance to remove.
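As a rough illustration of what blocking a problem instance (“defederation”) amounts to in practice, here is a short Python sketch that drops inbound federated activities whose sender lives on a denylisted domain. The domain names and the accept_activity helper are hypothetical, and real servers apply more nuance (such as limiting versus fully suspending a domain), but the core idea is the same: the content stops arriving here, yet it keeps existing on the originating server.

```python
from urllib.parse import urlparse

# Hypothetical denylist an instance administrator might maintain.
BLOCKED_DOMAINS = {"problem-instance.example"}

def accept_activity(activity: dict) -> bool:
    """Reject inbound federated activities from blocked domains."""
    actor = activity.get("actor", "")
    domain = urlparse(actor).hostname or ""
    return domain not in BLOCKED_DOMAINS

# Example: an incoming post from a blocked server is silently dropped.
incoming = {
    "type": "Create",
    "actor": "https://problem-instance.example/users/someone",
}
print(accept_activity(incoming))  # False
```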