Liz Kendall urges Ofcom to act faster on online safety as delays raise concerns

Are social platforms and the systems that govern them doing enough to make the internet safe for all?
Liz Kendall

Britain’s technology minister Liz Kendall has called on the country’s communications regulator, Ofcom, to move faster in enforcing new online safety laws designed to protect children and vulnerable users from harm.

In a letter to the regulator, Kendall said she was “deeply concerned” that delays in implementing key parts of the Online Safety Act could slow progress in tackling harmful content online, particularly abuse directed at women and girls, and rising antisemitism.

“Delays in implementing duties such as user empowerment could hinder our work to protect women and girls from harmful content and protect users from antisemitism,” Kendall wrote.

The Online Safety Act, a landmark piece of legislation passed in 2023 after years of debate, aims to hold social media platforms and search engines accountable for what happens on their sites, requiring them to remove illegal content and protect children from harmful material.

But more than two years after its passage, the law is still being phased in. Ofcom says it plans to publish a register of companies required to verify users and meet additional safety duties by July 2025, a year later than originally promised. The regulator attributed the delay to legal challenges and what it described as “complex issues” arising from the law’s implementation.

One major complication came from the Wikimedia Foundation, which operates Wikipedia. Earlier this year, its challenge against parts of the Act was dismissed by London’s High Court, but the judge left the door open to a future appeal, depending on how Ofcom categorises the site under the new rules.

If Wikipedia is labelled a “Category One” service, the same tier as social media giants like Facebook or TikTok, it would face stricter requirements, including user verification and detailed transparency reporting.

The Online Safety Act has divided opinion since its inception. Supporters argue it represents a long-overdue step in forcing tech companies to take greater responsibility for the content they host.

Critics, including free speech advocates and some U.S. technology firms, warn it could lead to overreach and the censorship of lawful content.

At its core, the Act requires online platforms to build systems that reduce the risks of illegal activity, such as terrorism, child sexual abuse, fraud, or hate crimes, and to take down such content quickly when found. It also demands stronger safeguards for children, including mandatory age checks on pornography sites and tools to block access to material encouraging self-harm, eating disorders, or suicide.

For adults, the Act promises more control. The largest social media companies, classed as Category One services, will need to offer users optional tools to filter out legal but harmful material, such as racist, misogynistic, or antisemitic content.

Kendall’s frustration reflects growing concern that bureaucratic delays could blunt the law’s impact. Many of the Act’s most significant provisions, including transparency reporting requirements and enforcement powers over senior tech executives, depend on Ofcom completing its guidance and categorisation work.

According to the government’s own timeline, platforms have already been given deadlines to assess risks related to illegal content and children’s safety. The regulator now has powers to fine companies up to £18 million or ten percent of their global revenue, whichever is greater, for failing to comply.

In extreme cases, Ofcom can even block offending sites from operating in the UK.

Still, until the final codes of practice are in place, enforcement remains limited. That’s why ministers are pushing for speed, especially given how fast online harms evolve.

As Kendall’s department warns, the risk is that outdated systems can’t keep up with the realities of algorithm-driven platforms that amplify harmful content at scale.
