Meta is taking another step toward enhancing online safety for younger users by rolling out upgraded protections on Instagram Teen Accounts and expanding the feature to Facebook and Messenger. Building on the Teen Account framework, which debuted last year and was extended to India on February 11, 2025, the company has now introduced stricter controls, including limits on Instagram Live access and sensitive content visibility unless parents provide explicit permission.
Under the latest update, teens under 16 will no longer be able to go Live on Instagram, or disable the feature that blurs images with suspected nudity in Direct Messages, without parental approval. These new protections will roll out over the coming months and are intended to give parents greater control over, and confidence in, how their teens engage with social media. In a significant expansion, Teen Accounts will now also be available on Facebook and Messenger, starting with users in the US, UK, Australia, and Canada, with other regions to follow.
Teen Accounts, which were introduced last year, automatically place users under 16 into a restricted experience designed to protect them from inappropriate content and unwanted interactions. According to Meta, these built-in protections have been embraced widely—97% of teens aged 13 to 15 have retained the default restrictions, while 94% of US parents surveyed by Ipsos found the Teen Account model helpful.
These accounts come with a suite of features aimed at creating a safer and more age-appropriate environment. All teen accounts are automatically set to private, meaning only approved followers can view content, and messaging is limited to people the teen already follows or is connected with. Content controls are set to the strictest level by default, shielding teens from exposure to sensitive content such as physical altercations or promotional content for cosmetic procedures. Teens are also protected from excessive tagging and mentioning, and offensive language is filtered out through the pre-enabled Hidden Words feature.
Meta’s protective measures go beyond content filtering. Time management tools like time limit reminders and sleep mode—automatically active between 10 PM and 7 AM—aim to help teens moderate their screen time. Notifications are muted during sleep hours, and teens receive alerts prompting them to exit the app after 60 minutes of usage.
For parents, Meta has introduced a robust set of supervision tools to complement the default restrictions. Parents can view who their teen has messaged in the past seven days (without access to message content), set daily time limits, and block Instagram access during specific hours, such as nighttime or school hours. If teens attempt to change settings to be less restrictive, parental approval is required. Even for older teens aged 16 and above, parents can enable supervision manually. Soon, Meta will allow parents to directly adjust settings to apply stricter controls.
Part of the groundwork for these changes was laid earlier this year, when Meta brought Instagram Teen Accounts to India in February 2025. At the time, Natasha Jog, Director of Public Policy India at Instagram, emphasized the company’s focus on creating a responsible digital space: “With the expansion of Instagram Teen Accounts to India, we are strengthening protections, enhancing content controls, and empowering parents, while ensuring a safer experience for teens.”
According to Meta, there are now at least 54 million active Teen Accounts globally. A key part of the company’s approach has been gathering continuous feedback from parents. A survey by Ipsos found that 85% of parents feel Teen Accounts make it easier to guide their teens toward positive experiences on Instagram, and more than 90% find the default protection settings helpful.
In addition to parental controls and safety measures, Meta has focused on strengthening age verification processes. This includes introducing additional checks when users try to register with an adult birthdate, to prevent underage users from bypassing restrictions. These steps are designed to ensure the right users receive the appropriate safety settings. Instagram will also continue enforcing Community Guidelines rigorously, with a stronger emphasis on hiding sensitive or harmful content—even from people users follow—if it's not deemed suitable for teen audiences.
Mansi Zaveri, Founder & CEO of Kidsstoppress.com, lauded the platform’s direction, saying, “Teen safety online is a growing concern for parents, and Instagram’s new Teen Accounts feature is a step in the right direction. Strengthening privacy settings, limiting unwanted interactions, and adding parental supervision tools can help create a safer space for young users.”
Echoing this sentiment, Uma Subramanian, Co-Founder & Director of the RATI Foundation, noted, “The recent updates to Instagram’s teen accounts introduce an additional layer of security and represent an incremental step toward strengthening online safety on Meta platforms. Balancing teen autonomy and online safety remains a complex challenge, and this initiative is a move in that direction.”
Meta’s efforts signal a broader industry shift toward designing platform experiences that prioritize teen safety while still allowing space for self-expression and social interaction.