Keeping Children Safe Online: Changes to the Online Safety Act Explained

The way children interact with the internet has changed dramatically over the last decade. Social media, streaming platforms, gaming sites and instant messaging apps are now a daily part of young people’s lives. Alongside the opportunities these platforms bring for learning and socialising, they also expose children to risks that would have been unimaginable a generation ago.
The Online Safety Act 2023, whose child safety duties came into force in July 2025, marks a major turning point in how the UK protects children in digital spaces. For the first time, platforms are under clear legal duties to prevent under-18s from being exposed to harmful content, while also respecting freedom of expression and users’ privacy rights.
This article explains the new protections, how they will be enforced, and what they mean for parents, children, and online service providers.
What has changed?
The Online Safety Act’s child safety duties are now in force. They require platforms and websites to ensure that children cannot access harmful online content. This covers material relating to:
- Pornography
- Self-harm
- Suicide
- Eating disorders
Ofcom research shows how urgent these measures are. Children as young as 8 have reported seeing pornography online, and nearly 1 in 6 teenagers reported encountering content in the previous month that stigmatises body types or promotes disordered eating.
The new law is designed to cut off this exposure by requiring platforms to adopt robust age verification systems and stronger child protection measures.
Age verification and secure access
A core feature of the Act is the introduction of mandatory age assurance. To access pornography or other harmful content, users must now prove that they are over 18. Acceptable verification methods include:
- Facial scans or facial estimation tools (which can estimate age without storing images)
- Photo ID checks
- Credit card validation
For children, this means it is now significantly harder to stumble across — or deliberately seek out — content that could cause long-term harm.
Crucially, the Act requires these systems to be secure, proportionate, and privacy-focused. Platforms cannot simply collect and store vast amounts of sensitive personal data. Instead, they must use third-party tools or privacy-by-design systems that confirm a user’s age without revealing or keeping their identity information.
The Information Commissioner’s Office (ICO) has reinforced that these services must comply with UK data protection law. Age checks must follow data minimisation principles, collecting only the bare minimum of information needed to confirm a user’s age.
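For the technically minded, the sketch below shows what a privacy-by-design age check might look like in practice. It is illustrative only: the provider, function names and response format are hypothetical, not drawn from any real age-assurance API. The point it demonstrates is that the platform receives a simple over-18 answer and an opaque reference, and never stores the user’s document, name or date of birth.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class AgeCheckResult:
    """The only data the platform retains: a yes/no answer and an opaque reference."""
    is_over_18: bool
    check_reference: str  # audit reference that reveals no identity data


def verify_age_via_provider(document_image: bytes) -> AgeCheckResult:
    """Hypothetical call to an external, certified age-assurance provider.

    Data minimisation: the image is passed straight through to the provider
    and never written to the platform's own storage.
    """
    # Stubbed response; a real integration would make an API call here.
    provider_response = {"over_18": True, "reference": "chk-0001"}
    return AgeCheckResult(
        is_over_18=provider_response["over_18"],
        check_reference=provider_response["reference"],
    )


def grant_adult_access(user_id: str, document_image: bytes) -> bool:
    """Gate access on the check result, keeping only the minimal outcome."""
    result = verify_age_via_provider(document_image)
    # Note what is *not* stored: no image, no date of birth, no name.
    print(f"user={user_id} over_18={result.is_over_18} ref={result.check_reference}")
    return result.is_over_18
```

The design choice that matters here is that deletion is not an afterthought: the platform never holds the sensitive identity data in the first place.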
Preventing harmful contact and grooming
Age assurance is only part of the picture. The Online Safety Act also obliges platforms to prevent children from being contacted by strangers.
This means:
- Children should not receive direct messages (DMs) from people they do not know.
- Platforms should not recommend accounts for children to follow or connect with, unless the accounts belong to friends or trusted contacts.
By limiting stranger contact and algorithmic suggestions, the Act aims to reduce risks of grooming, bullying, and exploitation.
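A rough sketch of the kind of contact gate these duties imply is shown below. The data model and function names are invented for illustration, not taken from any platform’s actual systems.

```python
from dataclasses import dataclass, field


@dataclass
class Account:
    id: str
    is_under_18: bool
    approved_contacts: set[str] = field(default_factory=set)


def can_send_dm(sender: Account, recipient: Account) -> bool:
    """Block unsolicited direct messages to child accounts."""
    if recipient.is_under_18:
        return sender.id in recipient.approved_contacts
    return True


def filter_suggestions(user: Account, candidates: list[Account]) -> list[Account]:
    """For child accounts, suggest only accounts they already know."""
    if not user.is_under_18:
        return candidates
    return [c for c in candidates if c.id in user.approved_contacts]
```

In a real system the same rule would need to apply wherever contact can be initiated: friend requests, group invitations, comments and tagging, not just direct messages.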
VPNs and workarounds
One of the main concerns raised during consultations was whether children could bypass age checks using Virtual Private Networks (VPNs).
VPNs remain legal in the UK, but the Act places responsibility on platforms to prevent children from using them as a loophole. Companies cannot promote VPN use to children or create content encouraging minors to bypass protections.
If platforms deliberately target UK children with these workarounds, they risk heavy penalties, including fines of up to £18 million or 10% of annual global turnover, whichever is greater.
The Age Verification Providers Association (AVPA) has already reported a surge of 5 million additional daily age checks across UK sites since the new law came into force.
What about legal adult content?
It’s important to note that the Online Safety Act does not ban legal adult content. Adults retain the right to access such material. The Act’s focus is purely on protecting children and creating age-appropriate online experiences.
Platforms must take a risk-based, proportionate approach to their duties. They should not arbitrarily block or remove lawful content, nor should they over-restrict adult access.
Balancing child safety with free speech
One of the more complex aspects of online regulation is ensuring that safety measures don’t undermine freedom of expression.
The Online Safety Act places equal weight on:
- Protecting children from harmful content, and
- Safeguarding free speech for adults.
The law is clear: platforms cannot use child safety duties as a pretext to censor lawful political or social debate. Only the most serious categories of harmful material — pornography, self-harm, suicide and eating disorders — require mandatory age gating.
Failure to meet either duty (protecting children or protecting free speech) exposes companies to the same penalties.
Government and regulator’s position
Technology Secretary Peter Kyle MP described the Act as:
“The most significant step forward in child safety since the internet was created.”
He emphasised that children are rarely seeking out harmful content deliberately. More often, it “finds them” through algorithms or chance exposure. Age verification, he argued, is not about restriction but about creating a safer and more positive space for children and young people.
Industry and charity responses
The Act has been widely welcomed by child protection organisations.
- NSPCC Chief Executive Chris Sherwood praised the law as a “vehicle for significant and lasting change”, noting that algorithms are being redesigned and harmful content is being curtailed.
- Barnardo’s CEO Lynne Perry described the protections as an “important stepping stone” that must be robustly enforced.
- Internet Matters called the changes a “milestone” but warned that Ofcom must rigorously hold platforms to account, given that 3 in 4 children aged 9–17 report harm online.
Enforcement and penalties
The law grants Ofcom significant powers to enforce compliance. Platforms that fail to meet their duties face:
- Fines up to £18 million or 10% of annual global turnover (whichever is higher).
- Potential criminal liability for senior managers who repeatedly breach obligations.
These enforcement mechanisms aim to ensure that compliance is not optional, but a cost of doing business in the UK.
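To make the “whichever is higher” rule concrete, here is the arithmetic as a short sketch. The turnover figures are invented for illustration.

```python
def maximum_fine(annual_global_turnover_gbp: int) -> float:
    """Maximum penalty under the Act: the greater of £18m or 10% of turnover."""
    return max(18_000_000, 0.10 * annual_global_turnover_gbp)


# A platform with £50m turnover still faces the £18m floor,
# while one with £2bn turnover faces a cap of £200m.
print(maximum_fine(50_000_000))     # 18000000
print(maximum_fine(2_000_000_000))  # 200000000.0
```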
Key points for parents
Parents may wonder what practical difference the Online Safety Act will make at home. Some key changes include:
- Stronger safeguards: Children should encounter fewer unsolicited harmful images or videos.
- Tighter controls: Age verification makes it harder for minors to access adult sites.
- Improved privacy: Systems are designed to protect identity and prevent data misuse.
- Safer contact rules: Children should no longer be contacted by strangers on mainstream platforms.
Even so, parents should stay engaged. The law cannot eliminate every risk, and open communication between parents and children remains essential.
Legal perspective: balancing rights and risks
From a legal standpoint, the Online Safety Act illustrates the difficult balance between child protection, privacy rights, and free expression.
- For children, the law enshrines a right to safety and protection from serious harm.
- For adults, it upholds the right to access legal material and express views without undue censorship.
- For platforms, it imposes a positive legal duty to design systems with children’s welfare in mind, rather than treating safety as an afterthought.
This is a major departure from the previous, largely self-regulated model of internet safety.
Looking ahead
The Online Safety Act is not the end of the conversation. Technology evolves quickly, and regulators will need to remain alert to new risks such as:
- Emerging platforms popular with young users.
- AI-generated content, including deepfakes.
- New methods children may use to bypass safeguards.
But as of August 2025, the UK has taken a decisive step towards making the digital world safer for children.
Final thoughts
For families, this law should bring reassurance that the government and regulators are addressing the very real dangers children face online.
For online platforms, the message is equally clear: protect children or face consequences.
And for society, the Act signals a shift in how we understand the internet: not as a lawless frontier, but as a space where children deserve the same level of protection as in schools, playgrounds, and homes.
As Peter Kyle MP put it:
“Rather than looking for ways around age verification, let’s help make the internet a safer, more positive space for children — and a better experience for everyone.”
That is the aspiration behind the Online Safety Act, and one that now carries the force of law.