Ofcom warns major platforms to enforce age limits and strengthen child safety measures
The UK’s communications regulator has ordered major social media and gaming platforms to demonstrate how they will better protect children online, warning that enforcement action could follow if progress is not made.
Ofcom has instructed major technology platforms to strengthen protections for children and to enforce minimum age rules properly on their services. The warning applies to widely used platforms including Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube, which have until 30 April 2026 to set out what actions they will take.
The move comes as Ofcom continues implementing the Online Safety Act, which introduced new legal obligations for digital platforms to reduce online harms. According to the regulator, investigations covering nearly one hundred online services since the law entered into force have already led to enforcement actions, changes to disrupt the sharing of child sexual abuse material, and the introduction of stronger age checks on many adult-content websites.
Despite these developments, Ofcom says the technology sector has not done enough to protect children online. Research cited by the regulator indicates that 72% of children aged eight to twelve use platforms whose official minimum age is thirteen, suggesting that age rules are widely ignored.
In a formal request to the largest platforms used by children, Ofcom outlined four areas where companies must take further action.
First, services must implement effective age verification or age assurance systems to prevent children from easily bypassing minimum age restrictions. While the Online Safety Act does not explicitly require age checks on social media platforms, Ofcom says they are necessary to enforce existing age policies.
Second, platforms must introduce stronger safeguards against online grooming, including preventing unknown adults from contacting children and ensuring that age controls apply to both children and adults using the services.
Third, companies must ensure safer algorithmic feeds for children. Recommendation systems that automatically suggest videos, posts or other content are identified by Ofcom as a major pathway through which children encounter harmful material. The regulator has issued legally binding information requests to examine how these algorithms operate.
Fourth, Ofcom says companies must stop testing new digital products or artificial intelligence tools on children without adequate safety assessments. Platforms will be expected to demonstrate that they have evaluated the risks of major product updates before launching them.
The regulator plans to publish a report in May 2026 assessing how companies have responded to the request. Ofcom has indicated that it will consider formal enforcement action or stricter regulatory requirements if it judges the measures proposed by platforms to be insufficient.
Ofcom chief executive Dame Melanie Dawes said the regulator is concerned about a gap between companies’ private commitments and the protections visible to users and parents.
Child protection organisations have welcomed the intervention. The NSPCC, a UK child protection charity, said stronger age enforcement and transparency are necessary to prevent harmful or addictive content from reaching children online.
The debate over child safety on digital platforms continues in parallel with government discussions about additional legislative measures. The regulator's findings, due in May 2026, are expected to influence how the UK further develops its approach to online safety, algorithmic accountability and age assurance technologies.
