Meta announced today one of its most significant privacy updates to date aimed at protecting teen users.
New measures hide content about self-harm, graphic violence, eating disorders, and other harmful topics from teens on Instagram and Facebook. Related content will now be blocked for users under 16 in Feeds and Stories, even if it’s shared by an account they’re following. When teens search for these topics, they are redirected to “expert resources” instead. The company said it consulted with experts on teen development when determining the type of content to block.
In addition, Meta will automatically assign existing teen users to the strictest content-screening settings, extending a previous update that applied only to new users. These users will not be able to opt out of the settings on Instagram or Facebook.
The social media giant is also rolling out new prompts encouraging teenagers to update their privacy settings by “turning on recommended settings”. This will automatically limit who can reshare their content and tag or mention them. It also prevents people who don’t follow teens from messaging them and hides offensive comments.
Today’s update significantly limits what minors can access, and it follows a series of recent lawsuits against Meta. These include a complaint filed by 41 states accusing Meta of harming the mental health of its youngest users, another complaint filed by Seattle schools over a youth “mental health crisis”, and a recent decision that social media companies will be forced to defend against juvenile addiction lawsuits.