Meta Announces Enhanced Safety Features for Young Users
In an effort to better safeguard teenage users, Meta is rolling out new safety settings across its Facebook and Instagram platforms. These changes are designed specifically for users under the age of 18.
Cameron Jordan
- 2024-01-14
- Updated 03:19 PM ET
(NewsNibs) - Meta, the parent company of Facebook and Instagram, has announced that it will implement a series of new safety settings to protect young users on its social media platforms. These settings will particularly target content related to self-harm and eating disorders, making such posts less visible to teens. To support these efforts, Meta has developed more than 30 tools and resources aimed at helping not just teens but also their parents navigate the platforms safely.
Enhanced Privacy and Search Restrictions
Under the new settings, all teenage users on Instagram and Facebook will be automatically placed in the most restrictive content control tier, a move intended to give teens safe and age-appropriate experiences online. Instagram also plans to restrict additional search terms that could lead to harmful content. In a further proactive step, Instagram will prompt teenagers to review their privacy settings through a new single-tap notification feature, encouraging users to keep control over their account privacy.
Following Public Criticism and Legal Challenges
These changes come in the wake of substantial public scrutiny and criticism of Meta over fears that its platforms may harm the mental well-being of younger users. Former Facebook employee Arturo Bejar testified before a Senate subcommittee, claiming that CEO Mark Zuckerberg and other executives had ignored warnings about the platforms' negative impact on teenagers, including issues such as sexual harassment by strangers. It has also come to light that Zuckerberg may have impeded initiatives aimed at teen user welfare. Moreover, Meta has faced lawsuits alleging that it failed to deactivate accounts belonging to children under 13, and a more recent lawsuit accused the company of providing an environment for child predators. The rollout of the new safety measures for users under 18 is expected to take place over the next few months.
Meta's Commitment to Online Safety for Teens
The new settings underscore Meta's commitment to creating a safer experience for teenage users on its networks. By strengthening default privacy settings for teens and expanding resources for these users and their parents, the company seeks to address past criticism and respond to growing demand for accountable social media practices that prioritize user well-being. The technology giant's initiative appears to be a step forward in reducing young people's exposure to potentially harmful content and interactions online.