Report recommends improvements to CyberTipline for online child exploitation before AI exacerbates the issue
According to a recent report from the Stanford Internet Observatory, a tipline established 26 years ago to address online child exploitation has not fully met its objectives and requires technological enhancements and other improvements to better assist law enforcement in pursuing offenders and saving victims.
The service, which the researchers call “enormously valuable,” also needs to be fixed urgently, as new artificial intelligence technology threatens to exacerbate its problems.
“In the years to come, the CyberTipline will almost certainly be flooded with highly realistic-looking AI content, making it even more difficult for law enforcement to identify real children who need to be rescued,” said researcher Shelby Grossman, an author of the report.
Congress established the service as the main line of defense for children who are exploited online. By law, tech companies must report child sexual exploitation material found on their platforms to a system run by the National Center for Missing and Exploited Children, or NCMEC. After receiving reports, NCMEC tries to identify the people who sent or received the material and, if possible, the victims, and then forwards the reports to law enforcement.
While the sheer volume of CyberTipline reports exceeds law enforcement’s capacity to investigate them, researchers say volume is just one of several core problems with the system. For example, many reports sent by technology companies — such as Google, Amazon and Meta — lack important details, such as sufficient information about the identity of the perpetrator, according to the report. This makes it difficult for law enforcement to know which reports to prioritize.
“There are significant problems with the whole system right now, and those cracks are becoming chasms in a world where artificial intelligence is producing brand-new CSAM,” said Alex Stamos, using the acronym for child sexual abuse material. Stamos is a Stanford lecturer and cybersecurity expert.
The system lags behind technologically and is plagued by a challenge that persists across government and nonprofit tech organizations: a shortage of highly skilled engineers, who can command much higher salaries in the tech industry. Sometimes those workers are even poached by the same companies that send in the reports.
Then there are legal restrictions. According to the report, court rulings have led NCMEC staff to stop reviewing some files (for example, if they are not publicly available) before sending them to law enforcement. Many law enforcement agencies believe they need a search warrant to access such images, which slows down the process. Sometimes multiple warrants or subpoenas are needed to identify the same offender.
The system is also easily overwhelmed. The report reveals that NCMEC recently hit a milestone of one million reports in a single day because of a meme that went viral across multiple platforms — which some people shared because they found it funny and others shared in outrage.
“That day actually led them to make some changes,” Stamos said. “It took them weeks to get through that backlog” by making it easier to group identical images.
CyberTipline received more than 36 million reports in 2023, almost all from online platforms. Facebook, Instagram and Google sent the most reports. The total number has increased dramatically.
Nearly half of the tips submitted last year were actionable, meaning NCMEC and law enforcement could follow up.
Hundreds of reports can concern the same offender, and many contain multiple images or videos. About 92% of reports filed in 2023 involved countries outside the United States — a marked change from 2008, when the majority involved victims or perpetrators inside the United States.
Some reports are false alarms. “It drives law enforcement crazy when they get reports that they perceive as clearly involving adults,” Grossman told reporters. “But the system incentivizes platforms to be very conservative and report potentially borderline content, because if material is later found to be CSAM and they knew about it but didn’t report it, they could face fines.”
One relatively easy fix proposed in the report would improve the way tech platforms tag what they report to distinguish widely shared memes from something that deserves closer scrutiny.
The Stanford researchers interviewed 66 people involved with the CyberTipline, from law enforcement officers to NCMEC staff to online platform employees.
NCMEC said it looked forward to “exploring the recommendations internally and with key stakeholders”.
“Over the years, the complexity of reports and the severity of crimes against children have continued to evolve. Therefore, leveraging new technological solutions throughout the CyberTipline process will result in more children being protected and offenders being held accountable,” it said in a statement.
Among the report’s other findings:
— The CyberTipline reporting form has no dedicated field for submitting chat-related material such as sextortion messages. The FBI recently warned of a “huge increase” in sextortion cases targeting children — including financial sextortion, where someone threatens to release compromising images unless the victim pays.
— Police detectives told the Stanford researchers that they have trouble getting higher-ups to prioritize these crimes, even after presenting them with detailed written descriptions of the material to underscore its seriousness. “They cringe when they read it, and they really don’t want to think about this,” Grossman said.
— Many law enforcement officials said they cannot fully investigate all reports because of time and resource constraints. A single detective can be responsible for 2,000 reports a year.
— Outside the United States, especially in poorer countries, the challenges of responding to child abuse reports are particularly serious. Law enforcement agencies may lack reliable internet connections, “proper computers” or even gas for cars to execute search warrants.
— A bill passed by the U.S. Senate in December would require online platforms to report child sex trafficking and online solicitation to the CyberTipline and would give law enforcement more time to investigate child sexual abuse. Currently, the tipline offers no simple way to report suspected sex trafficking.
While some advocates have proposed more intrusive surveillance laws to catch abusers, Stamos, the former head of security at Facebook and Yahoo, said authorities should try simpler fixes first.
“You don’t need to violate users’ privacy to put more pedophiles in prison. The reports are sitting right there,” Stamos said. “The system doesn’t work very well at taking the information that’s already out there and turning it into charges.”