CEOs of Meta, X, TikTok, and other social media platforms face intense Senate hearing on child exploitation
Children’s advocates and lawmakers argue that companies are failing to adequately safeguard young people from a range of dangers on social media, including sexual predators, addictive features, content promoting suicide and eating disorders, unrealistic beauty standards, and bullying.
On Wednesday, the CEOs of Meta, TikTok, X and other social media companies went before the Senate Judiciary Committee to testify as lawmakers and parents grow increasingly concerned about the impact of social media on young people’s lives.
The hearing began with recorded testimonies from children and parents who said they or their children were being exploited on social media. During the hours-long event, parents who lost children to suicide silently held pictures of their dead children.
“They are responsible for many of the dangers our children face online,” U.S. Senate Majority Whip Dick Durbin, who chairs the committee, said in opening remarks. “Their design choices, their failure to invest adequately in trust and safety, and their constant pursuit of engagement and profit over basic safety have all put our children and grandchildren at risk.”
In a heated question-and-answer session with Meta CEO Mark Zuckerberg, Republican Missouri Sen. Josh Hawley asked him whether he had personally compensated any of the victims or their families for what they went through.
“I don’t think so,” Zuckerberg replied.
“There are families of victims here,” Hawley said. “Do you want to apologize to them?”
As parents stood and held up pictures of their children, Zuckerberg turned to them and apologized for what they had experienced.
Hawley continued to press Zuckerberg, asking if he would take personal responsibility for any harm caused by his company. Zuckerberg stayed on message, reiterating that Meta’s mission is to “build industry-leading tools” and empower parents.
“To make money,” Hawley interjected.
South Carolina Sen. Lindsey Graham, the top Republican on the Judiciary panel, echoed Durbin’s sentiments and said he was willing to work with Democrats to resolve the issue.
“After years of working on this issue with you and others, I have come to the conclusion that social media companies as they are currently designed and operated are dangerous products,” Graham said.
He told executives that their platforms have enriched lives, but it’s time to address the “dark side.”
Starting with Discord’s Jason Citron, the executives touted existing safety tools on their platforms and the work they have done with organizations and law enforcement to protect minors.
Snapchat had broken ranks before the hearing and begun supporting a federal bill that would create legal liability for apps and social platforms that recommend harmful content to minors. Snap CEO Evan Spiegel reiterated that support on Wednesday and asked the rest of the industry to back the bill.
TikTok CEO Shou Zi Chew said TikTok is enforcing its policy that prevents children under 13 from using the app. X CEO Linda Yaccarino said the platform, formerly known as Twitter, does not cater to children.
“We don’t have a business dedicated to children,” Yaccarino said. She said the company also supports the Stop CSAM Act, a federal bill that would make it easier for victims of child exploitation to sue tech companies.
Still, child health advocates say social media companies have repeatedly failed to protect minors.
“When you’re faced with really important safety and privacy decisions, revenue shouldn’t be the first factor these companies consider,” said Zamaan Qureshi, co-chair of the youth-led organization Design It For Us. “These companies have had opportunities to do this before, but they didn’t. So independent regulation needs to step in.”
Dozens of states are suing Meta, alleging that it intentionally designs features into Instagram and Facebook that make children addicted to its platforms and has failed to protect them from online predators.
New internal emails between Meta executives released by Sen. Richard Blumenthal’s office show global affairs chief Nick Clegg and others asking Zuckerberg to hire more staff to strengthen “well-being across the company” as concerns grew about the impact on young people’s mental health.
“From a political perspective, this work has become increasingly urgent in recent months. Politicians in the U.S., U.K., E.U. and Australia are expressing public and private concerns about the effects of our products on the mental health of young people,” Clegg wrote in an August 2021 email.
The emails released by Blumenthal’s office do not appear to include a response, if any, from Zuckerberg. In September 2021, The Wall Street Journal published the Facebook Files, based on the internal documents of whistleblower Frances Haugen, who later testified before the Senate.
Meta has beefed up its child safety features in recent weeks, announcing earlier this month that it will begin hiding inappropriate content from teenagers’ Instagram and Facebook accounts, including posts about suicide, self-harm and eating disorders. It also limited minors’ ability to receive messages from anyone they don’t follow or connect with on Instagram and Messenger, and added new “nudges” to try to discourage teens from scrolling through Instagram videos or messages late at night. The nudges encourage children to close the app, although they don’t force them to do so.
But child safety advocates say these efforts by the companies have fallen short.
“When I look back, every time there’s been a Facebook scandal or an Instagram scandal over the last few years, they use the same playbook. Meta cherry-picks their statistics and talks about features that don’t address those harms,” said Arturo Béjar, the social media giant’s former director of engineering, who is known for his expertise in curbing cyberbullying and recently testified before Congress about the safety of children on Meta’s platforms.
Google’s YouTube was notably absent from the list of companies testifying before the Senate on Wednesday, even though more teens use YouTube than any other platform, according to the Pew Research Center. Pew found that 93 percent of U.S. teens use YouTube, with TikTok a distant second at 63 percent.