Facebook and Instagram have taken a significant step towards enhancing online safety for teenage users by introducing stricter controls on direct messages.

Blocking DMs to Teens

Meta has introduced a new layer of protection. The latest change builds on restrictions implemented in 2021, when Meta, the parent company of Facebook and Instagram, barred adults from messaging under-18 Instagram users who don't follow them.

Engadget reported that this protective measure is now being expanded to shield younger teens from potentially unwanted contact. 

Under the new policy, users under 16 (or 18, depending on their country) will no longer be able to receive DMs from anyone they don't follow, even if the sender is a fellow teen. 

This default setting is a crucial part of Meta's effort to create a safer online environment for its younger user base. The change applies to both Instagram and Messenger, with Messenger imposing additional restrictions.

Young users on Messenger will only be permitted to receive messages from their Facebook friends or people in their phone contacts. For teens under parental supervision, changing these settings requires a guardian's approval.

While this proactive approach aims to fortify online safety, the effectiveness of these measures ultimately depends on the accuracy of users' declared ages and on Meta's age-prediction technology.

That caveat underscores the inherent challenge of building a foolproof system for safeguarding teens from risks in the digital realm. In its press release, Meta emphasized its commitment to providing a secure, age-appropriate online experience for teenagers.

Meta also revealed plans to hide content related to self-harm, graphic violence, eating disorders, and other harmful topics from teens on Instagram and Facebook.

Under this new policy, users under the age of 16 will be shielded from encountering posts covering these sensitive subjects in their Feeds and Stories, even if shared by accounts they follow. 

In addition to content restrictions, Meta has introduced a mindfulness feature designed to send "nighttime nudges" to users under 18, encouraging them to close the app and go to bed if their scrolling exceeds 10 minutes.

Facing Legal Actions

Meta implemented these changes in response to a wave of lawsuits and complaints over its handling of younger users' safety.

The New York Times reported that the company faces legal action from 33 states, as an unsealed lawsuit alleges that Meta intentionally targeted children under 13 for its apps and websites, persistently collecting their data despite awareness of their ages. 

Based on unredacted internal Meta presentations related to the case, the Wall Street Journal reported that an estimated 100,000 child users faced daily harassment on Facebook and Instagram. 

These revelations underscore the imperative for Meta to enforce more stringent measures across its platforms.

Written by Inno Flores
