Meta has announced new measures to restrict teenagers' access to age-inappropriate content on its popular social media platforms, Instagram and Facebook. The company will now automatically place teenage users into the most restrictive content control settings.
As detailed in a recent blog post, Meta is expanding its existing policy, which was initially applied to new teenage users, to include those already active on Instagram and Facebook. This change aligns with expert guidance on digital safety for minors.
The update enforces stricter settings for Instagram's 'Sensitive Content Control' and Facebook's 'Reduce' feature. These controls are designed to limit exposure to potentially sensitive content across various sections of the platforms, including Search and Explore.
Additionally, Meta is taking a proactive stance against content related to suicide, self-harm, and eating disorders. When users search for these topics, the platforms will hide the results and instead direct users toward expert resources for assistance.
This comprehensive update, which will be implemented over the next few weeks, aims to create a safer and more age-appropriate online environment for teenage users.
Meta is also encouraging teens to review and update their safety and privacy settings regularly. New notifications will prompt them to switch to more private settings in a single step: selecting the "Turn on recommended settings" option automatically adjusts the account to enhance privacy. These adjustments include restrictions on content reposting, tagging, mentions, and message accessibility, as well as hiding offensive comments.