Meta Platforms said on Tuesday it will hide more content from teenagers on Instagram and Facebook, responding to global regulatory pressure urging the social media giant to protect children from harmful content on its platforms.
All teenagers will now be placed in the most restrictive content control settings on the apps, with additional limitations on search terms on Instagram, according to Meta’s blog post.
The changes aim to make it harder for teenagers to encounter sensitive content about suicide, self-harm, and eating disorders when using features such as Search and Explore on Instagram, Meta stated.
The company expects to implement these measures over the next few weeks to provide a more “age-appropriate” experience for users.
Meta faces regulatory challenges in both the United States and Europe, with accusations that its apps are addictive and contribute to a youth mental health crisis.
In October, attorneys general from 33 U.S. states, including California and New York, filed a lawsuit against the company, alleging that it repeatedly misled the public about the dangers of its platforms.
In Europe, the European Commission has sought information on how Meta protects children from illegal and harmful content.
This increased regulatory scrutiny follows testimony in the U.S. Senate by a former Meta employee, who claimed the company was aware of harassment and other harms faced by teens on its platforms but failed to address them.
That former employee, Arturo Bejar, criticized Meta’s changes, saying the company relied on “‘grade your own homework’ definitions of harm” and did not provide an easy way for teens to report unwanted advances.
The competition between Meta and TikTok for young users has intensified in recent years, with Facebook’s usage among teens declining. According to a 2023 Pew Research Center survey, 63% of U.S. teens reported using TikTok, 59% used Instagram, while only 33% used Facebook.