Ofcom finalises rules for tech firms to protect children online



The final version of rules which the regulator says will offer children in the UK “transformational new protections” online has been published.

Sites will have to change the algorithms that recommend content to young people and introduce beefed-up age checks by 25 July or face big fines.

Platforms which host pornography, or offer content which encourages self-harm, suicide or eating disorders are among those which must take more robust action to prevent children accessing their content.

Ofcom boss Dame Melanie Dawes said it was a “gamechanger”, but critics say the restrictions do not go far enough and are “a bitter pill to swallow”.

Ian Russell, chairman of the Molly Rose Foundation, which was set up in memory of his daughter, who took her own life aged 14, said he was “dismayed by the lack of ambition” in the codes.

But Dame Melanie told BBC Radio 4’s Today programme that age checks were a first step as “unless you know where children are, you can’t give them a different experience to adults.

“There is never anything on the internet or in real life that is foolproof… [but] this represents a gamechanger.”

She admitted that while she was “under no illusions” that some companies “simply either don’t get it or don’t want to”, the Codes were UK law.

“If they want to serve the British public and if they want the privilege in particular in offering their services to under 18s, then they are going to need to change the way those services operate.”

Prof Victoria Baines, a former safety officer at Facebook, told the BBC it is “a step in the right direction”.

Talking to the Today programme, she said: “Big tech companies are really getting to grips with it, so they are putting money behind it, and more importantly they’re putting people behind it.”

Under the Codes, algorithms must also be configured to filter out harmful content from children’s feeds and recommendations.

As well as the age checks, there will also be more streamlined reporting and complaints systems, and platforms will be required to take faster action in assessing and tackling harmful content when they are made aware of it.

All platforms must also have a “named person accountable for children’s safety”, and the management of risk to children should be reviewed annually by a senior body.

If companies fail to abide by the regulations put to them by 24 July, Ofcom said it has “the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK.”

The new rules are subject to parliamentary approval under the Online Safety Act.

