Meta Platforms (NASDAQ:META) has announced plans to enhance content restrictions for teens on Instagram and Facebook in response to global regulatory pressure aimed at protecting children from harmful content. In a blog post on Tuesday, Meta revealed that all teens will now be subjected to the most restrictive content control settings on the apps, with additional limitations on search terms for Instagram.
The upcoming changes aim to make it harder for teens to encounter sensitive content related to suicide, self-harm, and eating disorders, particularly through features like Search and Explore on Instagram. Meta said the measures, scheduled to roll out in the coming weeks, are designed to create a more “age-appropriate” experience for young users.
Facing regulatory scrutiny in both the United States and Europe, Meta has been accused of contributing to a youth mental health crisis through its addictive apps. In October, attorneys general from 33 U.S. states, including California and New York, filed a lawsuit against the company, alleging that it repeatedly misled the public about the dangers of its platforms.
In Europe, the European Commission has sought information regarding Meta’s measures to protect children from illegal and harmful content. This regulatory pressure intensified after a former Meta employee testified in the U.S. Senate, claiming that the company was aware of harassment and other harms affecting teens on its platforms but failed to take action.
The employee urged Meta to make design changes on Facebook and Instagram that would encourage positive behavior and give young users better tools to manage unpleasant experiences. While businesses have long targeted children as consumers in hopes of building lasting brand loyalty, Meta’s competition with TikTok for young users has intensified in recent years, a battle that could attract more advertisers seeking to reach this demographic.
Featured Image: Unsplash @ Julio Lopez