Facebook has misled parents, failed to protect children’s privacy: US regulators
U.S. regulators say Facebook has misled parents and failed to protect the privacy of children using its Messenger Kids app, including by misrepresenting app developers’ access to private user data.
As a result, the Federal Trade Commission on Wednesday proposed sweeping changes to its 2020 privacy order with Facebook (now Meta) that would prohibit the company from profiting from data it collects on users under 18, including data gathered through its virtual reality products. The FTC said the company has not fully complied with the 2020 order.
Meta would also face other restrictions, including limits on its use of facial recognition technology and an obligation to provide additional privacy protections to its users.
“Facebook has repeatedly broken its privacy promises,” said Samuel Levine, director of the FTC’s Bureau of Consumer Protection. “The company’s recklessness has put young users at risk, and Facebook must be held accountable for its failures.”
Meta called the announcement a “political stunt.”
“Despite three years of continuous engagement with the FTC around our agreement, they have not offered an opportunity to discuss this new, completely unprecedented theory. Let’s be clear about what the FTC is trying to do: usurp Congress’s authority to set industry-wide standards and instead single out one American company while Chinese companies like TikTok can operate without restriction on American soil,” Meta said in a prepared statement. “We have devoted enormous resources to building and implementing an industry-leading privacy program in accordance with the terms of our FTC settlement. We will vigorously fight this action and expect to prevail.”
Facebook launched Messenger Kids in 2017, pitching it as a way for kids to chat with family members and friends approved by their parents. The app does not give children separate Facebook or Messenger accounts. Instead, it works as an extension of a parent’s account, giving parents controls such as the ability to decide whom their children can chat with.
At the time, Facebook said Messenger Kids does not serve ads or collect data for marketing purposes, although it does collect data it said is necessary to use the service.
But child development experts raised immediate concerns.
In early 2018, a group of roughly 100 child-development experts, advocates and parenting organizations disputed Facebook’s claims that the app filled a need for a children’s messaging service. The group included non-profit organizations, psychiatrists, pediatricians, educators and children’s music singer Raffi Cavoukian.
“Messenger Kids doesn’t meet a need — it creates one,” the letter said. “It primarily appeals to kids who wouldn’t otherwise have their own social media accounts.” The letter also criticized Facebook for targeting younger children with a new product.
Facebook responded to the letter at the time saying that the app “helps parents and children communicate in a safer way” and stressed that parents are “always in control” of their children’s activities.
The FTC now says that hasn’t happened. The 2020 privacy order, which required Facebook to pay a $5 billion fine, also called for an independent evaluator to assess the company’s privacy practices. The FTC said the evaluator “identified several gaps and weaknesses in Facebook’s privacy program.”
The FTC also said that between late 2017 and 2019, Facebook misrepresented that parents could control who their children communicated with through the Messenger Kids product.
“Despite the company’s promises that children using Messenger Kids could communicate only with contacts approved by their parents, under certain circumstances children were able to communicate with unapproved contacts in group text chats and group video calls,” the FTC said.
As part of the proposed changes to the FTC’s 2020 order, Meta would also have to suspend the launch of new products and services without “written confirmation from an evaluator that its privacy program is fully compliant.”