Meta is introducing tougher safety measures for teenagers on Instagram, blocking livestreaming for under-16s unless they have parental permission.
The move comes as part of a wider plan to extend under-18 protections across Facebook and Messenger.
Under the changes, teenagers under 16 will no longer be able to use Instagram's Live feature without approval from a parent or guardian.
Parental permission will also be needed if they want to disable a tool that automatically blurs images containing suspected nudity in direct messages.
The new measures accompany the rollout of teen accounts on Facebook and Messenger, a system first launched on Instagram last year.
These accounts place under-18s into default safety settings and give parents the ability to set daily time limits, block access at specific times, and monitor which accounts their child interacts with.
The Facebook and Messenger teen accounts will initially launch in the UK, US, Australia, and Canada.
Users under 16 will need parental permission to adjust their settings, while those aged 16 and 17 will have more freedom to make changes on their own.
According to Meta, 54 million teenagers worldwide currently use Instagram teen accounts, with over 90% of 13- to 15-year-olds keeping their default restrictions in place.
The National Society for the Prevention of Cruelty to Children (NSPCC) welcomed the expansion of safety measures to Facebook and Messenger.
However, the charity emphasised that Meta must do more to stop harmful content from appearing on its platforms.
These updates arrive as the UK begins enforcing the new Online Safety Act.
From March, more than 100,000 websites and apps, including Facebook, Google, X, Reddit, and OnlyFans, must take steps to prevent illegal material, such as child sexual abuse imagery, fraud, and terrorist content, from appearing online.
The legislation also requires tech companies to shield under-18s from harmful content, including material promoting suicide and self-harm.
Recent reports that the Online Safety Act could be weakened as part of a potential UK-US trade agreement have drawn strong criticism from child protection groups, who argue that any such compromise would betray public trust.
At the time Instagram’s teen safety features were first launched, Meta’s then president of global affairs, Nick Clegg, stated that the company aimed to give parents greater control over how their children use social media.
Clegg also acknowledged that many parents had yet to take full advantage of available child safety tools.