US Senators Grill Chiefs of Meta, TikTok, and X Over Child Sexual Exploitation
WASHINGTON: US senators grilled the chief executives of major social media companies on Wednesday, accusing them of failing to protect children from sexual predators on their platforms and calling on Congress to pass legislation quickly.
The hearing marks the latest effort by lawmakers to address concerns from parents and mental health experts that social media companies are putting profits ahead of ensuring their platforms do not harm children.
“Mr. Zuckerberg, you and the companies in front of us, I know you don’t mean it, but you have blood on your hands,” said Sen. Lindsey Graham, referring to Meta CEO Mark Zuckerberg. “You have a product that kills people.”
Zuckerberg testified alongside X CEO Linda Yaccarino, Snap CEO Evan Spiegel, TikTok CEO Shou Zi Chew and Discord CEO Jason Citron.
Senator Dick Durbin, the Democratic chairman of the Judiciary Committee, cited statistics from the National Center for Missing and Exploited Children, a nonprofit group, showing that financial "sextortion," in which a predator tricks a minor into sending explicit photos and videos, had skyrocketed last year.
“This disturbing increase in child sexual abuse is driven by one thing: changes in technology,” Durbin said at the hearing.
As the hearing began, the committee played a video in which children talked about being victimized on social media platforms.
“I was sexually abused on Facebook,” said one child in the video, their face obscured in shadow.
In the hearing room, dozens of parents stood holding pictures of their children as they waited for the CEOs to arrive.
X’s Yaccarino said the company supported the STOP CSAM Act, legislation introduced by Durbin that seeks to hold tech companies accountable for child sexual exploitation material and allow victims to sue tech platforms and app stores.
The bill is one of several aimed at children’s online safety. None has become law.
X, formerly Twitter, has come under heavy criticism since Elon Musk bought the platform and loosened moderation policies. This week, the company blocked searches for pop singer Taylor Swift after fake sexually explicit images of Swift went viral on the platform.
On Wednesday, TikTok CEO Chew appeared before US lawmakers for the first time since March, when the Chinese-owned short video app company faced tough questions, including some suggesting the app harms children’s mental health.
“We make careful product design choices to help make our app inhospitable to those who want to harm teenagers,” Chew said, adding that TikTok’s community guidelines strictly prohibit anything that puts “teenagers at risk of exploitation or other harm — and we police them vigorously.”
Chew revealed that more than 170 million Americans use TikTok each month — 20 million more than the company reported last year.
Under questioning from Graham, he said TikTok would spend more than $2 billion on trust and safety efforts, but declined to say how that figure compared to the company’s total revenue.
Sen. Ted Cruz grilled Zuckerberg about an Instagram warning screen that alerted users an image might contain child sexual abuse material but still allowed them to view it.
“Mr. Zuckerberg, what the hell were you thinking?” Cruz said.
Zuckerberg responded that it could be useful to redirect users to resources rather than block content outright, adding that the company would look into the matter.
Earlier in the hearing, he reiterated that the company had no plans to move forward with its previously proposed children’s version of Instagram.
Senator Amy Klobuchar on Wednesday criticized the failure to act on tech safety legislation, contrasting it with the swift response after a panel blew off a Boeing plane earlier this month.
“When a Boeing plane lost its door in flight several weeks ago, no one questioned the decision to ground the fleet… So why aren’t we taking the same decisive action on the dangers of these platforms when we know children are dying?” Klobuchar said.