UK to toughen online safety laws to protect people from self-harm content
The UK will amend the Online Safety Act to make self-harm content a priority offence. Tech companies will be legally required to detect and remove harmful material before it reaches users, with Ofcom overseeing enforcement.

The UK government has announced that it will strengthen the Online Safety Act by classifying online content that encourages or assists serious self-harm as a ‘priority offence’. The move is intended to ensure that both children and adults are better protected from harmful material online.
What the change means
Until now, the Online Safety Act has placed clear legal duties on platforms to protect children from dangerous self-harm content. Under the new regulations, these protections will extend to users of all ages, recognising that adults with mental health challenges can also be vulnerable.
By designating self-harm content as a priority offence, the government will require tech companies to go beyond reacting when harmful material is reported. Instead, platforms must proactively use technology to detect and remove content before it reaches users. Companies that fail to act will face tougher enforcement measures.
What is a priority offence?
Under the Online Safety Act, a priority offence is a category of harmful content that platforms are legally required to actively prevent and remove, not just respond to after a user reports it. This means companies must design systems, policies, and detection tools to stop such material from appearing in the first place. Other examples of priority offences include terrorism content and child sexual abuse material. Adding self-harm content to this list brings it under the strictest possible safeguards.
Government and expert views
Technology Secretary Liz Kendall said the changes underline the government’s determination to tackle harmful online material:
‘Vile content that promotes self-harm continues to be pushed on social media and can mean potentially heart-wrenching consequences for families across the country. Our enhanced protections will make clear to social media companies that taking immediate steps to keep users safe … is not an option, but the law.’
The charity Samaritans welcomed the reform. Chief Executive Julie Bentley said it was crucial to close gaps in protection:
‘While the internet can be a source of support for people who are struggling, damaging suicide and self-harm content can cost people their lives. It’s therefore vital that government continues to take opportunities to strengthen the Act … so we can save more lives lost to suicide.’
Next steps
A Statutory Instrument amending the law is expected to be laid before Parliament in autumn 2025. The new rules will take effect 21 days after they are approved by both Houses of Parliament.
Once in force, the strengthened law will give the regulator, Ofcom, greater powers to hold platforms accountable and to require them to demonstrate that they are actively preventing the spread of harmful self-harm material.