From July, tech companies will be legally required to block children’s access to harmful online content or face hefty fines under new rules announced by Ofcom under the Online Safety Act.
The UK communications regulator unveiled over 40 safety measures on Monday, covering websites and apps frequently used by children, including social media platforms, search engines, and online games.
Firms must implement these safety rules by 25 July or risk penalties, including fines or even being banned from operating in the UK.
Among the most significant measures, the largest and riskiest platforms will be required to use robust age verification tools to identify users under 18.
They must also ensure that harmful content is filtered out of recommendation algorithms and that dangerous material can be swiftly removed. Furthermore, all platforms must offer children a simple and accessible way to report harmful or abusive content.
Ofcom Chief Executive Melanie Dawes said the new rules mark a turning point for online safety in Britain.
She emphasised the importance of making digital spaces safer for young users by ensuring fewer harmful posts in their social media feeds, better protection from contact with strangers, and stronger safeguards against exposure to adult content.
Technology Secretary Peter Kyle welcomed the move and revealed the government is also considering a social media curfew for children.
His comments follow TikTok’s recent launch of a 10pm ‘wind-down’ feature for under-16s. Kyle said he is monitoring its effectiveness closely and would pursue similar measures only if backed by solid evidence.
Kyle described Ofcom’s new safety code as a “watershed moment” in the fight against online harms. He stressed the need for digital platforms to ensure children can explore the online world without being subjected to harmful or dangerous content.
Online platforms will be expected to crack down on the spread of abusive, violent and hateful material, as well as online bullying. Even stricter controls will be placed on highly dangerous content relating to suicide, self-harm, eating disorders, and pornography, which must be fully removed from children’s feeds.
However, online safety campaigner Ian Russell, whose 14-year-old daughter Molly tragically died after viewing harmful online content, criticised the new rules. Russell, founder of the Molly Rose Foundation, believes the codes fall short of what is needed, accusing Ofcom of prioritising tech industry profits over children’s safety.
He expressed disappointment, stating that the guidance lacks the urgency and strength required to prevent future tragedies like his daughter’s.