Britain’s Information Commissioner’s Office (ICO) has launched a formal investigation into TikTok, Reddit, and Imgur over their handling of children’s personal data.
The probe seeks to determine whether these platforms comply with the UK’s stringent data protection laws, particularly in how they use algorithms to recommend content to minors aged 13–17 and how they verify whether users are old enough to be on their services.
The ICO’s inquiry into TikTok focuses on its use of personal information from teenage users to tailor content recommendations, while investigations into Reddit and Imgur center on their age-assurance mechanisms.
Social media platforms rely on sophisticated algorithms to prioritize content and maintain user engagement. However, these systems often amplify similar material, raising concerns that children could be exposed to increasingly harmful or inappropriate content.
For instance, TikTok’s recommendation engine uses behavioral data to curate videos for users, potentially influencing younger audiences with content that may not align with their developmental needs.
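In simplified terms, that feedback loop looks something like the sketch below. This is a hypothetical illustration, not TikTok’s actual algorithm: items are ranked purely by how many topic tags they share with a user’s watch history, so recommendations quickly concentrate around whatever the user engaged with last.

```python
# Hypothetical engagement-driven recommender -- illustrative only, not
# TikTok's actual system. Ranking by overlap with watch history shows
# how similar material gets amplified over time.
from collections import Counter

def recommend(catalog, watch_history, k=3):
    """Rank unseen items by how many topic tags they share with history."""
    seen_tags = Counter(tag for item in watch_history for tag in item["tags"])
    unseen = [item for item in catalog if item not in watch_history]
    return sorted(unseen,
                  key=lambda item: sum(seen_tags[t] for t in item["tags"]),
                  reverse=True)[:k]

catalog = [
    {"id": 1, "tags": ["dance", "music"]},
    {"id": 2, "tags": ["diet", "fitness"]},
    {"id": 3, "tags": ["diet", "extreme"]},  # a more extreme take on the same theme
    {"id": 4, "tags": ["news", "politics"]},
]
history = [{"id": 2, "tags": ["diet", "fitness"]}]
print(recommend(catalog, history))  # the remaining "diet" item ranks first
```

Even in this toy version, one viewing of a “diet” video pushes the more extreme variant of the same theme to the top of the feed, which is the amplification dynamic regulators worry about.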
Reddit and Imgur face scrutiny over their ability to accurately assess the age of users, a critical measure to ensure compliance with legal age restrictions.
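The weakest and most common form of age assurance is a self-declared date of birth. The hypothetical check below (not Reddit’s or Imgur’s actual mechanism) shows why regulators consider it insufficient on its own: it only blocks underage users who enter their real birthday.

```python
# Hypothetical self-declared date-of-birth gate -- the weakest common form
# of age assurance, since it trusts whatever the user enters.
from datetime import date

MIN_AGE = 13  # the minimum age most platforms set in their terms of service

def is_old_enough(dob, today=None):
    today = today or date.today()
    # Subtract one if this year's birthday hasn't happened yet.
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= MIN_AGE

# Blocks a user born in 2014 -- but only if they enter their real birthday,
# which is exactly the gap stronger age assurance is meant to close.
print(is_old_enough(date(2014, 6, 1), today=date(2025, 3, 3)))  # False
```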
“The responsibility to keep children safe online lies firmly at the door of the companies offering these services and my office is steadfast in its commitment to hold them to account,” said John Edwards, the UK Information Commissioner.
“If social media and video-sharing platforms want to benefit from operating in the UK, they must comply with data protection law.”
This is not TikTok’s first encounter with UK regulators. In 2023, the platform was fined £12.7 million ($16 million) for breaching data protection laws by processing the personal data of children under 13 without parental consent.
In response to the investigation, Reddit emphasized its commitment to compliance. A spokesperson told news agency Reuters that while most of its users are adults, the company plans to roll out updates this year to address new UK regulations around age verification.
ByteDance, TikTok’s parent company, and Imgur have yet to comment publicly on the matter.
The investigations come amid a global reckoning over how digital platforms impact young users. The ICO’s Children’s Code, which took full effect in 2021, mandates that online services likely to be accessed by children must prioritize their privacy and safety.
Since its implementation, several platforms, including X (formerly Twitter), Sendit, and BeReal, have made significant changes to their practices, such as disabling geolocation tracking for minors and turning off personalized advertising.
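In engineering terms, changes like these boil down to applying high-privacy defaults to accounts held by minors. A minimal, hypothetical sketch of that pattern follows; the setting names and age threshold are illustrative, not any platform’s actual configuration.

```python
# Hypothetical "high-privacy defaults for minors" helper, the pattern the
# Children's Code encourages; names and thresholds here are illustrative.
from dataclasses import dataclass

@dataclass
class AccountSettings:
    geolocation_enabled: bool
    personalised_ads: bool

def default_settings(age: int) -> AccountSettings:
    """Minors get location tracking and ad personalisation off by default."""
    if age < 18:
        return AccountSettings(geolocation_enabled=False, personalised_ads=False)
    return AccountSettings(geolocation_enabled=True, personalised_ads=True)

print(default_settings(15))
# AccountSettings(geolocation_enabled=False, personalised_ads=False)
```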
Despite these advancements, critics argue that enforcement remains inconsistent. “While the Children’s Code has driven meaningful improvements, there’s still a long way to go; platforms need to demonstrate a genuine commitment to protecting young users rather than treating compliance as a box-ticking exercise,” commented a Facebook user reacting to the announcement.
The crackdown reflects a broader trend of governments worldwide tightening regulations on tech giants. The European Union’s Digital Services Act (DSA) and California’s Age-Appropriate Design Code Act are among recent legislative efforts aimed at curbing the risks online platforms pose to children.
These measures signal a shift toward holding companies accountable not only for what content they host but also for how their algorithms shape user experiences.
For the UK, the stakes are high. As one of the first countries to implement comprehensive child-focused data protection standards, Britain’s approach could serve as a model for other nations grappling with similar challenges. However, balancing innovation with regulation remains a delicate task.
Looking ahead, the outcome of these investigations could set a precedent for future enforcement actions. If violations are found, the ICO has the authority to impose penalties, issue reprimands, or demand operational changes. Meanwhile, the watchdog has pledged to collaborate closely with Ofcom, the UK’s communications regulator, to ensure a coordinated approach to online safety.
Ofcom, for its part, has already mandated that social media and online platforms submit a risk assessment by March 31, detailing the likelihood of users encountering illegal content on their sites.
This requirement stems from duties that came into force last year under the Online Safety Act, which obliges companies such as Meta’s Facebook and Instagram, as well as ByteDance’s TikTok, to proactively address criminal activity and enhance user safety.
The law demands that these platforms evaluate and mitigate risks associated with a broad spectrum of offenses, including terrorism, hate crimes, child sexual exploitation, and financial fraud.
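The shape of that exercise can be sketched as a simple record mapping each offence category to a likelihood rating. The skeleton below is purely illustrative; the real categories and methodology are defined by the Act and Ofcom’s guidance.

```python
# Hypothetical skeleton of an illegal-content risk assessment record.
# The actual categories and process are set by the Online Safety Act and
# Ofcom's guidance; this only illustrates the shape of the exercise.
from dataclasses import dataclass, field

PRIORITY_OFFENCES = ("terrorism", "hate crime",
                     "child sexual exploitation", "financial fraud")

@dataclass
class RiskAssessment:
    platform: str
    likelihood: dict = field(default_factory=dict)  # offence -> low/medium/high

    def rate(self, offence: str, level: str) -> None:
        if offence not in PRIORITY_OFFENCES:
            raise ValueError(f"unknown offence category: {offence}")
        self.likelihood[offence] = level

assessment = RiskAssessment("example-platform")
for offence in PRIORITY_OFFENCES:
    assessment.rate(offence, "medium")  # placeholder ratings, not real data
print(assessment.likelihood)
```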