Meta is changing direct message settings for teens on Instagram and Facebook.

Meta Platforms Implements Stricter Message Settings for Teens on Instagram and Facebook

In response to growing concerns from lawmakers and parents regarding harmful content, Meta Platforms Inc. is further strengthening the default direct message settings for teenagers on Instagram and Facebook. This move is part of the company’s ongoing efforts to prioritize safety on its social media platforms.

According to a company blog post, teens on Instagram will no longer be able to receive direct messages from anyone they don't follow or aren't connected to, including other teens. Users under 16 in the US, or under 18 in the UK and Europe, will only be able to receive messages or be added to group chats by people they are already connected to, Meta told Bloomberg via email. Teens on supervised accounts will need parental approval to change the setting, which also applies to Messenger.


The supervision tools were first released on Instagram in March 2022, after a whistleblower leaked internal documents suggesting that Facebook had deliberately prioritized profits over user well-being and safety. The controversy led to congressional testimony and sparked debate about how to protect minors online. In October 2023, more than 30 US states filed a lawsuit against Meta alleging harmful marketing to young people.

Meta also plans to roll out a feature to help protect teens from seeing unwanted and potentially inappropriate images in messages, even from people they're already connected to. More information is expected later this year.

Meta Strengthens Teens' Defenses on Instagram and Messenger

AFP

On Thursday, Meta began blocking messages from strangers sent directly to young teens via Instagram or Messenger.

By default, teens under 16 can now only be messaged, or added to group chats, by people they already follow or are connected with, according to the announcement.

Changing the setting requires approval through “parental control tools” built into the apps, the tech company said in a blog post.

Meta added that it is working on a feature to prevent teenagers from seeing unwanted or potentially inappropriate images in any direct message.

“We’ll have more to say about this feature, which also works in encrypted chats, later this year,” Meta said.

At the beginning of this month, Meta tightened content restrictions for teenagers on Instagram and Facebook amid accusations that its platforms are harmful to young people.

The company said the restricted material includes content dealing with suicide or self-harm, as well as nudity and mentions of restricted goods.

Instagram’s restricted items include tobacco products and weapons, as well as alcohol, birth control, cosmetic procedures and weight loss programs, according to its website.

In addition, teens will now default to the most restrictive settings on Instagram and Facebook, a policy that previously applied only to new users and is now being extended to existing ones.

The changes come months after dozens of US states accused Meta of harming the mental health of children and teenagers and misleading users about the safety of its platforms.

Internal Meta research leaked by whistleblower Frances Haugen and reported by outlets including the Wall Street Journal showed that the company had long been aware of the risks its platforms posed to young people's mental health.

On Meta's platforms, teenagers are defined as users under 18, based on the date of birth they provided when registering.
