UK government consults on AI and copyright laws to boost creative industries and innovation

The UK government is consulting on copyright reforms to balance AI innovation with creators’ rights. Running until February 2025, proposals include opt-in licenses for creators, a data mining exception, and a balanced middle-ground approach. With £124.8 billion from creative industries and AI’s growth at stake, the UK aims to clarify rules, boost trust, and keep its global edge. The question remains: can creators and AI developers finally find common ground?

The UK government has launched a significant consultation aimed at clarifying copyright laws for AI developers and the creative industries. This move is designed to foster innovation and growth, ensuring that both sectors can thrive in the digital age. The consultation, which runs until 25 February 2025, seeks feedback on potential changes to UK copyright legislation in light of AI advancements.

Currently, uncertainty about how copyright law applies to AI is a major barrier to progress. Creators often struggle to control or seek payment for the use of their work, while AI firms face legal risks that stifle investment and innovation. This ambiguity has led to a stalemate, with previous attempts to establish voluntary AI copyright codes proving unsuccessful.
The consultation proposes four potential options for addressing these challenges:

Do Nothing: This option would leave UK copyright laws as they are, deferring the matter to the courts to resolve on a piecemeal basis. The government does not favor this approach, as it would prolong the current legal uncertainty.

Opt-In Model: This would strengthen copyright protection for rights holders by requiring an express license before AI models can be trained on copyrighted works. While popular with rights holders, it could hinder the government’s goal of boosting the AI economy.

Broad Data Mining Exception: This approach, similar to Singapore’s model, would allow data mining on copyrighted works for AI training without the rights holder’s permission. It is likely to be popular with AI developers but less favored by rights holders.

Balanced Approach: This middle-ground option would allow AI developers to train models using material to which they have lawful access, provided that rights holders have not expressly reserved their rights. It would be subject to robust transparency measures, requiring developers to disclose the material used to train their models.
The consultation outlines three key objectives:

Empowering Creators: Giving rights holders more control over their content and ensuring they can license—and be paid for—its use in AI training.

Supporting AI Developers: Providing lawful access to high-quality data so the UK remains a hub for cutting-edge AI development.

Building Trust: Increasing transparency between creators and developers to foster collaboration rather than conflict.

To achieve these goals, the government proposes a dual approach:

A Reservation Mechanism: Creators could opt out of having their work used for AI training, allowing them to negotiate licensing deals instead.

An Exception for Text and Data Mining: If creators don’t reserve their rights, AI developers could use their material under certain conditions, provided there’s transparency about what’s being used and how.

This balancing act aims to protect intellectual property while enabling innovation—a delicate tightrope walk that requires input from both sides.

Why Does This Matter?

The UK’s creative industries are a powerhouse, contributing £124.8 billion annually to the economy and employing thousands. They’re not just about entertainment; they’re a cornerstone of British identity and global influence.

Meanwhile, AI is reshaping industries worldwide, with the potential to boost productivity by up to 1.5 percentage points annually, according to the International Monetary Fund. Both sectors are vital to the UK’s economic future—but they’re currently at odds.

At the heart of the issue is how AI models are trained. These systems rely on vast datasets, often scraped from the internet, which may include copyrighted material. For creators, this raises alarming questions: Are their works being used without permission? Are they being compensated fairly? For AI developers, the uncertainty over what’s legal stifles innovation and drives investment overseas.

“This status quo cannot continue,” the government warns in its consultation document. “It risks limiting investment, innovation, and growth in the AI sector, and in the wider economy.”

One recurring theme in the consultation is the need for transparency. Right now, many creators feel left in the dark about how their work is being used. “There’s a lack of clarity from AI developers about what content is or has been used and how it is acquired,” the document notes. This opacity undermines trust and makes enforcement difficult.

Dr. E. C, an expert in intellectual property law, explains: “Transparency isn’t just a nice-to-have; it’s essential. When creators know exactly how their work is being used, they’re more likely to engage constructively with AI developers.”

While the proposals sound promising, delivering them won’t be easy. Technical solutions are needed to make it simple for creators to reserve their rights and for developers to verify compliance. Existing tools have limitations, and adoption rates vary widely.

Moreover, international cooperation will be crucial. Since copyright law applies where copying occurs, AI developers often train models abroad to avoid ambiguity. To keep the UK competitive, the government must align its framework with global standards—a task easier said than done.

The divide between creators and developers isn’t just theoretical. Recent controversies highlight the stakes. For instance, authors have expressed outrage over AI-generated books mimicking their styles without consent. Meanwhile, AI companies argue that overly restrictive laws could stifle innovation.

Sarah Thompson, a novelist and advocate for creators’ rights, says, “If my work is used to train an AI that writes novels, I should have a say—and I should be paid. It’s as simple as that.”

On the other hand, James Patel, CEO of a London-based AI startup, counters: “We’re not stealing anyone’s work. We’re building tools that benefit society. But the current legal gray area makes it hard to operate here.”

The consultation also touches on broader issues, such as:

Computer-Generated Works: Should purely AI-created content receive copyright protection? And if so, who owns it?

Digital Replicas: With deepfake technology advancing, how can individuals control the use of their likeness?

Synthetic Data: As AI models increasingly rely on artificially generated datasets, what impact will this have on the ecosystem?

These questions underscore the complexity of regulating AI. They also highlight the urgency of finding answers before problems escalate.

Fabrice Iranzi

Journalist and Project Leader at LionHerald, strong passion in tech and new ideas, serving Digital Company Builders in UK and beyond
E-mail: iranzi@lionherald.com
