Exploring the Contrast between Meta’s Welcoming Discussions and Hostile Online Environments
In a bid to create a more amicable space for online conversations, Mark Zuckerberg has presented Threads, Meta’s app resembling Twitter, as a welcoming platform for public discourse. He emphasized its stark contrast to the confrontational nature of Twitter, which is owned by billionaire Elon Musk.
“We’re definitely focused on friendliness and making this a friendly place,” Meta CEO Zuckerberg said Wednesday, shortly after the service launched.
Maintaining that idealistic vision for Threads — which attracted more than 70 million users in its first two days — is another story.
To be sure, Meta Platforms is no novice at managing the raging, spamming internet hordes. The company said it will impose the same rules on users of the new Threads app that it maintains on Instagram, its photo and video sharing service.
The owner of Facebook and Instagram has also been taking an increasingly algorithmic approach to serving content, giving it more control over what kind of fare succeeds as it steers users toward entertainment and away from news.
But Threads brings new challenges: Meta plans to make the app interoperable with other social media services such as Mastodon, and microblogging holds strong appeal for news junkies, politicians and other lovers of rhetorical combat.
First, the company is not expanding its existing fact-checking program to Threads, spokeswoman Christine Pai said in an email Thursday. This eliminates a distinguishing feature of how Meta has managed misinformation in its other applications.
Pai added that posts on Facebook or Instagram that have been deemed false by fact-checking partners – which include a Reuters unit – will carry those labels over if they are also posted on Threads.
Meta declined to answer when asked by Reuters to explain why it was taking a different approach to misinformation on Threads.
Instagram head Adam Mosseri acknowledged on a New York Times podcast Thursday that Threads “supported more public conversation” than Meta’s other services and was therefore likelier to attract a news-focused audience, but said the company aimed to focus on lighter subjects such as sports, music, fashion and design.
Nevertheless, Meta’s ability to distance itself from controversy was immediately challenged.
Within hours of the release, Threads accounts seen by Reuters posted about the Illuminati and “billionaire Satanists,” while other users compared each other to Nazis and fought over everything from gender identity to violence in the West Bank.
Conservative figures, including the son of former US President Donald Trump, complained of censorship after labels appeared warning would-be followers that they had posted false information. A Meta spokesperson said those labels were an error.
INTO THE FEDIVERSE
More content moderation challenges are in store once Meta links Threads to the so-called fediverse, where users of servers operated by other, non-Meta entities will be able to communicate with Threads users. Meta’s Pai said Instagram’s rules would likewise apply to those users.
“If an account or server, or if we find multiple accounts on a given server, are found to be violating our rules, they will be banned from Threads, meaning that server’s content will no longer appear on Threads and vice versa,” she said.
Still, researchers specializing in online media said the devil would be in the details of how Meta approaches these interactions.
Alex Stamos, director of Stanford’s Internet Observatory and former chief security officer at Meta, posted on Threads that the company would face bigger challenges in performing key kinds of content moderation enforcement without access to back-end data about users who post banned content.
“In federation, the metadata that the big platforms use to tie accounts to a single actor or detect abusive behavior at scale is not available,” Stamos said. “This makes it much harder to stop spammers, troll farms and financially driven abusers.”
In later posts, he said he expected Threads to limit the visibility of fediverse servers hosting large numbers of abusive accounts and to apply tougher penalties to those posting illegal material such as child pornography.
Even so, interoperability itself presents challenges.
“There are some really strange complications that arise when you start thinking about illegal things,” said Solomon Messing of New York University’s Center for Social Media and Politics. He cited examples such as child exploitation, explicit sexual imagery and arms sales.
“If you come across this kind of material while crawling content (from other servers), do you have any responsibility other than blocking it from Threads?”