Watchdogs urge big tech to do more to protect children online

Ofcom has written to tech firms urging them to explain what actions they are taking on age checks and grooming protections


Big tech has been warned by two watchdogs that it must do more to protect young people online, as it stands accused of “failing to put children’s safety at the heart of their products”.

Communications regulator Ofcom has written to Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube, giving them until the end of April to explain what actions they are taking on age checks and grooming protections.

They are also being urged to set out how they are ensuring safer online feeds for children by tackling harmful algorithms, and how they assess risk before rolling out updates on their platforms – with Ofcom calling for an “end to product testing on children”.

Alongside Ofcom’s demands, the Information Commissioner’s Office (ICO) has written to TikTok, Snapchat, Facebook, Instagram, YouTube and X (formerly Twitter), asking them to set out how their age assurance policies keep children safe.

The data regulator said there is “no excuse” not to have effective age checks in place, and that relying on children to self-declare their ages is not good enough.

ICO chief executive Paul Arnold told firms: “With ever growing public concern, the status quo is not working and industry must do more to protect children.”

He said most services use self-declaration to identify whether children are 13 or over and that this method can be easily bypassed and is therefore ineffective.

Ofcom said that without proper protections such as strict age checks, children are being “routinely exposed to risks”.

Its chief executive, Dame Melanie Dawes, said: “These online services are household names, but they’re failing to put children’s safety at the heart of their products. There is a gap between what tech companies promise in private, and what they’re doing publicly to keep children safe on their platforms.

“Without the right protections, like effective age checks, children have been routinely exposed to risks they didn’t choose, on services they can’t realistically avoid. That must now change quickly, or Ofcom will act.”

Ofcom said it will publicly report in May on the responses from the platforms it has contacted, and alongside this will publish new research on how much or how little children’s online experiences have changed during the first year of the Online Safety Act being in force.

The regulator said if it is not satisfied with the platforms’ responses, “we will be ready to take enforcement action” and could consider strengthening the regulatory requirements under existing industry codes “to ensure further change”.

The ICO said the firms it has contacted, which it deemed some of the “highest risk services”, must “act now to identify and implement current viable technologies to prevent children under your minimum age from accessing your service”.

The watchdog said “further regulatory action” could be taken if it feels this is necessary, adding that it “expects full cooperation” from the firms contacted.

Mr Arnold said: “Our message to platforms is simple: act today to keep children safe online. There’s now modern technology at your fingertips, so there is no excuse not to have effective age assurance measures in place.

“Platforms need to be ready to demonstrate what they’re doing to keep underage children out and safeguard those children that are old enough to access their services.”

The NSPCC welcomed the call for greater transparency from tech firms as to how they are protecting children from harm, saying that “for too long, social media giants have looked the other way while harmful and addictive content floods children’s feeds, undermining their safety and wellbeing”.

Its chief executive, Chris Sherwood, said: “We’ve long called for minimum age limits to be properly enforced on social media, so it’s encouraging to see Ofcom confront this head on. Platforms must finally know who is using their services so that they can stop children accessing spaces that were never designed for them.

“As an urgent priority, Government must now give Ofcom the full powers it needs to enforce these effective age checks on young users, rein in dangerous algorithms, and finally hold tech companies to account when they fail to keep children safe.”

While new laws under the Online Safety Act came into force in August 2025 to protect under-18s from harmful online content, age checks specifically on under-13s are not explicitly required by the act.

The NSPCC said this means Ofcom is not empowered to hold services which state that they have minimum age policies of 13 accountable for enforcing them.

The Molly Rose Foundation welcomed Ofcom “turning up the heat on reckless tech firms and their dangerous products which continue to cause daily harm to children” and said the regulator must “follow up by showing its teeth with enforcement action if companies fail to satisfy that they are taking the safety and wellbeing of young people seriously”.

Andy Burrows, chief executive of the foundation, which was set up in memory of 14-year-old Molly, who took her own life after viewing harmful content on social media, said: “Parents overwhelmingly want regulation to succeed and will be heartened to see Ofcom take action to compel tech companies to better protect children from harmful feeds and ensure the Online Safety Act is being fully complied with.”

Technology Secretary Liz Kendall said: “Ofcom has my full support in robustly holding these services to account.

“No company should need a court order to act responsibly to protect children.

“They have been put on notice they will face the full force of the law if they fail to keep children safe.”

On Monday, MPs rejected a ban on social media for under-16s.

The age limit had been backed by peers earlier this year after growing calls from campaigners, including actor Hugh Grant.

However, a ban could still come in future after the Commons supported a Government bid to give additional powers to the Technology Secretary, who could potentially “restrict or ban children of certain ages from accessing social media services and chatbots”.
