Google, Microsoft, Meta, and Apple have all previously faced lawsuits for failing to protect the interests of the underage users on their platforms. (Pexels)

Tech Giants Microsoft, Google, and Meta Sued Over Alleged Child Exploitation

More than 30 US states filed a joint lawsuit against Meta on Tuesday, with several others filing separately. The lawsuit accuses Meta of using features in Instagram and Facebook to attract children to its platforms and expose them to harmful content. The case highlights the ongoing problem of companies exploiting vulnerable children for financial gain. However, Meta is not the sole culprit. Over the years, various tech companies, including Google, Microsoft, and Apple, have faced similar lawsuits over their failure to safeguard underage users.

Protecting Children Online: The Premise

So if you’ve come across numerous news articles about Meta’s lawsuit, you might be curious to know why children need to be protected online and what dangers they’re exposed to. The answer is a bit complicated.

Unlike in real life, where dangers are visible and often harm a person’s physical well-being, in the digital space there are invisible threats that harm a person’s emotional and mental well-being.

For example, one of the accusations against Meta is that its algorithm promotes content harmful to children on Facebook and Instagram. This content can be sexual or violent in nature, affecting the psychology of a growing child, but the harm can also be subtler: even age-appropriate content can be addictive and damaging. One study found that teens who regularly use social media before bed are 53 percent more likely to have poor sleep quality. It also found that "increased use of social media correlates with emergency room visits for mental health problems, including depression, addiction, and anxiety."

Because even minor triggers can affect the still-developing minds of children, many non-profit organizations and governments believe that higher levels of vigilance and protection must be available to underage users.

So who is to blame?

It turns out, nearly everyone. Most social media platforms, as well as companies that make age-agnostic consumer products, have faced complaints, petitions, or lawsuits over the issue. The following are some of the most notable cases of recent years, excluding the ongoing Meta trial.

Meta in 2021: In March 2021, Russia filed several cases against Facebook, Twitter, Google, TikTok, and Telegram, following protests in Russia over the arrest of Alexei Navalny. All the companies allegedly failed to remove messages urging children to join protests.

The Meta Whistleblower Case in 2021: In 2021, former Meta employee Frances Haugen came forward as a whistleblower, producing internal documents that she said showed the company was targeting its younger user base for profit. Haugen also revealed an internal study finding that many teenage girls who used Instagram were suffering from depression and anxiety over body image issues. Haugen's testimony to Congress is cited in Tuesday's complaint.

Microsoft in 2002: In early 2002, Microsoft proposed to settle private lawsuits by donating $1 billion in cash, software, services, and training, including Windows licenses and refurbished computers, to approximately 12,500 underserved public schools. The judge viewed the proposal as a potential windfall for Microsoft, since it would not only train schoolchildren on Microsoft products but also flood the education market with them.

Twitter (now X) in 2023: This lawsuit alleged that Twitter knowingly hosted and widely disseminated child sexual abuse material depicting two 13-year-old boys, which was viewed, shared, and downloaded hundreds of thousands of times on the platform.

Google in 2019: Google and its subsidiary YouTube paid $170 million following allegations by the Federal Trade Commission and the New York Attorney General that YouTube illegally collected personal information from children without their parents’ consent.

Google in 2014: In 2014, a parent filed a class-action lawsuit against Google over "in-app" purchases, microtransactions that can be made inside apps. The parent argued that after a credit card was authorized, there was a 30-minute window during which further purchases could be made without re-entering a password, that "free" apps were designed to entice children into making such purchases, and that Google should have been aware of the problem.

Google in 2006: The US Department of Justice tried to force Google to turn over more than a million web addresses from the company's database and a week's worth of search engine queries stripped of personal information. The request was meant to help combat Internet pornography and fend off legal challenges to the Child Online Protection Act.

Apple in 2011: In 2011, five parents filed a class-action lawsuit against Apple over “in-app” purchases, which are purchases made within apps. The parents claimed that Apple had failed to disclose that “free” apps used by children could collect payments without the parent’s knowledge.

These lawsuits show that tech platforms need to improve the protections they offer to users under the age of 18 and ensure that psychological problems such as depression, anxiety, and eating disorders do not spread because of the content displayed to children.
