Meta will soon prevent children under 16 from livestreaming on Instagram unless their parents explicitly approve.
The new safety rule is part of Meta's broader push to protect young users online. It will launch first in the UK, US, Canada and Australia before extending to the rest of Europe and other regions in the coming months.
The company explained that teenagers under 16 will also need parental permission to disable a feature that automatically blurs images suspected of containing nudity in direct messages.
These updates build on Meta's teen supervision programme introduced last September, which gives parents more control over how their children use Instagram.
Instead of limiting the changes to Instagram alone, Meta is now extending similar protections to Facebook and Messenger.
Teen accounts on those platforms will be private by default, automatically block messages from strangers, reduce exposure to violent or sensitive content, and remind users to take a break after an hour of use. Notifications will also pause during usual bedtime hours.
Meta said these safety tools already apply to at least 54 million teen accounts. The company claims the new measures will give teenagers and parents better support in making social media use safer and more intentional, rather than leaving young users unprotected or unsupervised online.