Online Safety Laws Toughened: Blocking Self-Harm Content Becomes a Legal Duty

 
09/09/2025
5 min read

The government has announced new measures that will require social media platforms to proactively block self-harm content under the Online Safety Act.

This change closes a significant gap in the law. Until now, platforms were obliged to stop content encouraging suicide, but not content promoting self-harm. With today’s announcement, the rules are being extended to ensure both types of material are treated as priority harms.

Why the change matters

The decision follows mounting pressure from campaigners, child protection experts, and families affected by online harms.

The National Crime Agency (NCA) recently warned of an “unprecedented risk” posed to children by so-called “Com networks” — online groups that groom children into self-harm and sexual abuse.

Unlike traditional harmful content that children might stumble upon, these networks actively coerce and encourage young people to harm themselves. Some even incite children to share images of their injuries as part of cruel competitions.

The scale of the problem has been described as a “tsunami” of toxic content that continues to reach children, despite the Online Safety Act coming into force earlier this year.

What the new rules say

Under the strengthened provisions:

  • Platforms must prevent publication of self-harm content before it appears online, not just remove it afterwards.
     
  • The same proactive duty already in place for suicide content is now extended to cover self-harm.
     
  • Companies that fail to block this material face enforcement by Ofcom, including fines of up to £18 million or 10% of global annual turnover, whichever is greater.
     

Technology Secretary Liz Kendall made clear that this is not optional:

“Vile content that promotes self-harm continues to be pushed on social media and can mean potentially heart-wrenching consequences for families across the country.

Our enhanced protections will make clear to social media companies that taking immediate steps to keep users safe from toxic material that could be the difference between life and death is not an option, but the law.”

The Molly Russell case: catalyst for reform

The announcement was welcomed by the Molly Rose Foundation, a suicide prevention charity named after Molly Russell, the 14-year-old schoolgirl who died in 2017 after being exposed to harmful content online.

An inquest into her death found that social media content relating to self-harm, suicide, and depression had contributed to her death.

The charity has long campaigned for self-harm to be recognised as a “priority harm” under the Online Safety Act. Today’s announcement represents the fulfilment of that call.

Foundation CEO Andy Burrows said:

“Coercing and grooming young people to harm themselves is now at the frontline of self-harm risks online and presents a growing and sadistic threat to children.

Molly Rose Foundation has long called for self-harm offences to be considered a priority harm under the Online Safety Act so we strongly welcome the government’s action in the face of this rapidly increasing threat.”

Enforcement and regulation

With this change, Ofcom’s role expands further. The regulator must now ensure platforms:

  • Deploy robust content moderation systems capable of detecting and blocking self-harm content before publication.
     
  • Invest in AI-driven tools and human moderation teams to identify covert or coded language used in grooming networks.
     
  • Balance these duties with freedom of expression obligations, ensuring lawful discussion of mental health support is not wrongly censored.
     

This last point is critical. The law is not intended to block legitimate resources for young people struggling with mental health, such as NHS advice pages or support group forums. Instead, it targets content designed to promote, encourage, or glamorise self-harm.
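By way of illustration only, the sketch below shows one way a platform might structure such a pre-publication gate: content classified as promoting self-harm is blocked outright, support-seeking content is allowed through, and low-confidence decisions are held back for human review rather than auto-published. The classifier, category names, and threshold are hypothetical assumptions for the example, not anything specified by the Act or by Ofcom.

```python
# Purely illustrative sketch of a pre-publication screening gate.
# The categories, keyword checks, and threshold are assumptions for
# demonstration only; a real system would use trained moderation models.

from dataclasses import dataclass
from enum import Enum


class Category(Enum):
    PROMOTES_SELF_HARM = "promotes_self_harm"  # must be blocked before publication
    SEEKS_SUPPORT = "seeks_support"            # must remain available
    UNRELATED = "unrelated"


@dataclass
class ScreeningResult:
    category: Category
    confidence: float


def classify(text: str) -> ScreeningResult:
    """Stand-in for a real moderation model; simple keyword matching for the demo."""
    lowered = text.lower()
    if "you should hurt yourself" in lowered:
        return ScreeningResult(Category.PROMOTES_SELF_HARM, 0.95)
    if "where can i find support" in lowered:
        return ScreeningResult(Category.SEEKS_SUPPORT, 0.90)
    return ScreeningResult(Category.UNRELATED, 0.99)


def allow_publication(text: str, review_threshold: float = 0.8) -> bool:
    """Block content classified as promoting self-harm; hold low-confidence
    decisions for human review instead of publishing automatically."""
    result = classify(text)
    if result.category is Category.PROMOTES_SELF_HARM:
        return False
    if result.confidence < review_threshold:
        return False  # routed to a human moderation queue (not modelled here)
    return True


if __name__ == "__main__":
    print(allow_publication("Where can I find support for self-harm urges?"))   # True
    print(allow_publication("you should hurt yourself"))                         # False
```

The detail is hypothetical, but the design point matches the new duty: the decision to block happens before publication, not as a takedown afterwards, while support-seeking content passes through untouched.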

Timeline

  • March 2025: The NCA highlights “Com networks” as an urgent and unprecedented online risk.
     
  • August 2025: Online Safety Act duties come into force, covering suicide, pornography, and eating disorder content.
     
  • September 2025: Government announces extension of proactive blocking to self-harm content.
     
  • Autumn 2025: New rules expected to come into force, giving Ofcom enforcement powers.

Reactions from child safety organisations

The change has been broadly welcomed across the child protection sector.

  • NSPCC said the decision reflects the “reality of online risks” and that platforms must be held accountable for the environments they create.
     
  • Barnardo’s described the amendment as “a critical safeguard” that closes a dangerous loophole in the law.
     
  • Internet Matters urged Ofcom to ensure strong enforcement, noting that 3 in 4 children still report experiencing harm online.

What this means for social media platforms

For tech companies, the strengthened rules raise the bar again:

  • Technical obligations: Platforms will need to expand detection systems beyond suicide content to cover a broader range of harmful material.
     
  • Algorithm redesign: Recommendation systems must be reconfigured to avoid surfacing harmful content in feeds.
     
  • Compliance burden: Platforms face steep fines and reputational damage if they fail to comply.
     

This is part of a broader trend of governments worldwide imposing legal duties of care on online services, rather than leaving moderation to voluntary codes.

The legal balance: protection vs expression

Lawyers and regulators face a difficult challenge: preventing harmful content without suppressing legitimate discussions of mental health.

For example:

  • Content encouraging self-harm must be blocked.
     
  • Content seeking support or raising awareness must remain available.
     

The distinction can be subtle and context-specific. Ofcom is expected to issue codes of practice setting out how platforms should strike this balance.

Support for families

For families affected by these issues, today’s announcement offers reassurance that the law is evolving in step with new risks.

But legal protections alone cannot eliminate danger. Parents, schools, and communities still play a vital role in:

  • Talking openly with children about their online experiences.
     
  • Recognising warning signs of grooming or self-harm risks.
     
  • Directing children to safe resources such as the NHS or Samaritans.

Final thoughts

The government has signalled that online safety rules will continue to evolve as risks change. By bringing self-harm content into the same legal category as suicide encouragement, the UK has closed a dangerous loophole and strengthened protections for children.

For tech companies, the message is stark: blocking harmful content is now a legal requirement, not a voluntary measure.

For families, the hope is that no more children will suffer the same fate as Molly Russell, whose story has been the driving force behind years of campaigning for change.
