Ofcom investigates porn sites over child protection failures under Online Safety Act
Under the Online Safety Act, online services must ensure children cannot access pornographic content on their sites

The UK communications regulator Ofcom has launched formal investigations into two companies behind adult websites: Itai Tech Ltd, which operates the nudification platform Undress.cc, and Score Internet Group LLC, which runs Scoreland.com. Both platforms are being scrutinised for failing to implement highly effective age assurance measures, potentially violating the Online Safety Act, which mandates strict safeguards to prevent children from accessing online pornography.
The investigations are part of Ofcom’s broader age assurance enforcement programme, introduced after the Act’s new requirements came into force in January 2025. Under these rules, any online service that publishes or displays pornographic content must take robust steps to verify users’ ages and prevent underage access.
Earlier this year, Ofcom contacted numerous services requesting detailed plans on how they would comply with the new obligations. Over 1,300 sites confirmed either full implementation of highly effective age assurance or active plans to introduce it. Some platforms opted to block UK users altogether rather than comply with the verification requirements. However, a number of services either did not respond or failed to take meaningful action, prompting enforcement action such as these latest investigations.
Why this matters – for users and civil society
The protection of children from harmful online content is one of the most pressing issues in today’s digital environment. Easy access to explicit material poses serious risks to children’s mental health, development, and understanding of relationships and sexuality. The Online Safety Act represents a significant step in regulating an internet that has, for too long, lacked adequate child protection standards.
For users, especially parents and guardians, this enforcement signals that regulators are actively working to make the internet a safer place for minors. It also emphasises the importance of digital literacy and parental oversight, since technological protections must go hand-in-hand with education and communication at home.
For civil society organisations, particularly those advocating for child welfare and digital rights, this is a crucial opportunity to participate in shaping enforcement practices and to ensure they remain transparent, proportionate, and effective. By engaging in consultations, submitting research, and monitoring outcomes, civil society can hold both the platforms and regulators accountable.