The UK’s internet watchdog, Ofcom, has launched an enforcement program under the Online Safety Act (OSA) to assess how file-sharing and file-storage services are preventing the dissemination of child sexual abuse material (CSAM).
Citing evidence that these platforms are particularly vulnerable to such misuse, Ofcom has written to several service providers, putting them on notice and preparing to issue formal information requests.
These companies must conduct Illegal Content Risk Assessments and implement measures to identify and remove CSAM, with failure to comply potentially resulting in penalties of up to 10% of their global annual turnover.
The enforcement effort includes collaboration with law enforcement agencies and organizations such as the Internet Watch Foundation and the US-based National Center for Missing & Exploited Children (NCMEC).
To meet OSA requirements, Ofcom’s Codes of Practice recommend providers adopt automated moderation technologies such as perceptual hash-matching, which can identify images that are merely similar to known CSAM, not just identical copies.
Ofcom has already engaged with high-risk services, including smaller platforms, to evaluate their compliance efforts. Moving forward, the watchdog will continue monitoring and engaging with these services, determining whether formal enforcement actions are necessary.
The overarching goal is to ensure that file-sharing and file-storage platforms implement effective systems to minimize the presence of CSAM and swiftly remove any illegal content when detected.
Here’s what’s happening, why it matters, and what it could mean for the digital landscape.
Let’s start with the basics. Ofcom, the UK’s communications watchdog, has launched a new initiative to make sure file-sharing and storage services (think platforms where you can upload, store, and send files) are doing their bit to stop CSAM from circulating.
This comes as the OSA, a landmark law passed in 2023 to clean up the internet, starts enforcing its “illegal content duties.” From today, companies running these services have to take serious steps to keep users safe, especially kids, or face hefty fines.
Why focus on file-sharing sites? Ofcom’s research shows these platforms are “particularly susceptible” to being misused by people sharing CSAM: images and videos that exploit and harm children in the worst ways imaginable.
Unlike social media, where posts are often public and moderated, file-sharing services can be quieter corners of the web, letting illegal content slip through the cracks.
So, Ofcom’s plan is to shine a spotlight here, starting with letters to a bunch of these companies (they’re keeping names under wraps for now) to warn them: formal requests for info are coming soon.
They’ll have to show what they’re doing or plan to do to stop CSAM and hand over risk assessments proving they’ve thought this through.
Why this matters: A closer look at the stakes
This isn’t just bureaucratic box-ticking; it’s about protecting kids from real harm. CSAM isn’t a small problem; it’s a global crisis.
The Internet Watch Foundation (IWF), a UK charity that hunts down this stuff, found over 275,000 webpages containing CSAM in 2023 alone. And that’s just what they caught.
File-sharing services, with their ease of use and often lax oversight, can be a goldmine for offenders.
A 2021 report from ActiveFence, a tech safety firm, flagged these platforms as hotspots for not just CSAM but also terrorist content and hate speech, noting how their anonymity and accessibility make them prime targets for abuse.
The stakes are high for companies too. If they don’t comply with the OSA, they could be slapped with fines of up to 10% of their global annual turnover.
For a big player like Dropbox or Google Drive, that’s potentially billions.
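To make that concrete, here’s a back-of-the-envelope sketch of the fine ceiling in Python; the revenue figure is purely illustrative, not a prediction about any company:

```python
# Rough sketch of the OSA's maximum penalty: 10% of global annual turnover.
# The turnover figure below is illustrative only.

OSA_FINE_CAP = 0.10  # penalty ceiling as a fraction of global annual turnover

def max_osa_fine(global_annual_turnover: float) -> float:
    """Theoretical upper bound of an OSA fine, in the same currency units."""
    return global_annual_turnover * OSA_FINE_CAP

# Hypothetical firm with $300bn in global annual revenue:
print(f"Fine cap: ${max_osa_fine(300e9) / 1e9:.0f}bn")  # -> Fine cap: $30bn
```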
So, how do you stop something this awful at scale? Ofcom’s answer is a clever bit of tech: perceptual hash-matching. Sounds jargony, right?
Here’s the simple version: it’s like giving the internet a fingerprint scanner. This tech creates a unique “hash”, a digital signature, for known CSAM images.
When someone uploads a file, the system checks it against a database of these hashes. If it’s a match, or even close enough, it gets flagged and yanked down fast.
What’s cool about perceptual hash-matching (as opposed to exact “cryptographic” hashing, where changing a single pixel produces a completely different hash) is that it can spot images that aren’t identical but still similar, like when someone crops or tweaks a photo to dodge detection.
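To see what that means in practice, here’s a minimal Python sketch using a simple “difference hash” (dHash). This is not PhotoDNA or any production CSAM scanner (those algorithms aren’t public); the hash size, threshold, and known-hash database are all illustrative placeholders:

```python
# Minimal sketch of perceptual hash-matching using a "difference hash"
# (dHash). Not PhotoDNA or any production scanner; hash size, threshold,
# and the known-hash database are illustrative placeholders.

from PIL import Image  # pip install Pillow

def dhash(image: Image.Image, hash_size: int = 8) -> int:
    """Shrink, grayscale, then compare each pixel with its right neighbour."""
    small = image.convert("L").resize((hash_size + 1, hash_size))
    px = list(small.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = px[row * (hash_size + 1) + col]
            right = px[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits  # a 64-bit fingerprint with the default hash_size of 8

def hamming_distance(a: int, b: int) -> int:
    """How many bits differ between two hashes."""
    return bin(a ^ b).count("1")

MATCH_THRESHOLD = 10  # illustrative: how "close" counts as a match

def is_flagged(upload: Image.Image, known_hashes: set[int]) -> bool:
    """Flag an upload if its hash is near any hash of known illegal images."""
    h = dhash(upload)
    return any(hamming_distance(h, k) <= MATCH_THRESHOLD for k in known_hashes)
```

The key property: a small edit (a crop, a re-compress, a colour tweak) flips only a few bits of the hash, so the edited copy stays within a short Hamming distance of the original, whereas a cryptographic hash like SHA-256 changes completely after a single-pixel edit.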
Ofcom’s codes of practice, rolled out in February 2025, say all file-sharing services should use this if their risk assessments show they’re at high risk for CSAM.
It’s not optional for the big guys, and even smaller platforms can’t ignore it if the data points their way.
This isn’t sci-fi; it’s already out there. Microsoft’s PhotoDNA, a version of this tech, has been helping companies like Facebook and Twitter catch CSAM for years.
A 2023 Tech Coalition survey found 89% of its member companies use some form of image hash-matching, pulling down millions of illegal files annually. Ofcom’s betting on this to level up safety across the board.
This enforcement push is just the start. Ofcom’s already been chatting with smaller, riskier platforms and big names alike, plus teaming up with law enforcement and groups like the Canadian Centre for Child Protection (C3P) to pinpoint the worst offenders.
They’ll keep digging, assessing responses, sniffing out non-compliance, and maybe dropping the hammer with fines if companies don’t shape up.
The ripple effects could be huge. The OSA applies worldwide: if your service touches UK users, you’re on the hook. That’s got global firms scrambling, and it might nudge other countries to tighten their own rules. The EU’s Digital Services Act, for instance, has similar vibes, and the US is watching closely.
For us regular folks, it’s a mixed bag. A safer internet for kids? Absolutely worth it. But will it mean more hoops to jump through when we share files, or, worse, more eyes on our data? Time will tell.
For now, Ofcom’s betting that tech and tough rules can outpace the creeps. As officials put it, “We’re not waiting for harm to happen, we’re acting now.” Whether that’s a promise or a warning depends on how this all shakes out.