UK Government to Amend Data Protection Bill Amid Backlash from Artists Over AI Use of Copyrighted Work

 
01/05/2025
6 min read

The UK government is preparing to revise its proposed Data Protection and Digital Information Bill in response to growing concerns from artists, authors, musicians, and other creatives. At the heart of the controversy is the use of their copyrighted content in training artificial intelligence (AI) systems—often without consent, compensation, or even knowledge.

As the digital economy continues to evolve post-Brexit, the UK is seeking to redefine its data protection laws independently of the EU's General Data Protection Regulation (GDPR). However, the proposed legislation has triggered alarm within the creative industries, particularly with regard to AI development practices.

Background: The Rise of AI and the Data Dilemma

The Data Protection and Digital Information Bill, currently under review in Parliament, is a wide-ranging piece of legislation designed to modernize the UK’s data laws, streamline compliance requirements for businesses, and foster innovation. One of its goals is to facilitate research and development in AI, a rapidly expanding field that is central to the UK’s broader tech ambitions.

AI systems—such as large language models and image generators—rely heavily on vast datasets to function effectively. These datasets are often compiled through a process known as text and data mining (TDM), which involves scanning and extracting data from publicly available content across the internet. For artists and creators, this means that their books, music, paintings, photographs, and more may be ingested into these systems without their explicit permission.
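To make the mechanics concrete, the sketch below is a deliberately tiny, illustrative example of what text and data mining looks like in code: it downloads publicly available pages and strips out their text for a training corpus. The URL and scale are hypothetical; production pipelines crawl billions of documents, which is exactly where the consent questions discussed in this article arise.

```python
# Minimal, illustrative text-and-data-mining (TDM) sketch.
# The source list is a placeholder; this is not any company's actual pipeline.
import urllib.request
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect the visible text fragments from an HTML page."""

    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.chunks.append(text)


def mine_page(url):
    """Download one publicly accessible page and return its extracted text."""
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="ignore")
    extractor = TextExtractor()
    extractor.feed(html)
    return " ".join(extractor.chunks)


if __name__ == "__main__":
    # Hypothetical source list; real crawls cover billions of pages.
    sources = ["https://example.com/"]
    corpus = [mine_page(url) for url in sources]
    print(f"Collected {len(corpus)} document(s) for a training corpus.")
```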

This practice has sparked a critical debate: how can the UK encourage innovation and leadership in AI while respecting and protecting the intellectual property (IP) rights of its creative community?

Artists Speak Out: “Our Work Is Not Free Fuel for AI”

Leading authors, musicians, visual artists, and industry groups have voiced strong objections to the bill’s current wording, arguing that it lacks adequate safeguards against mass scraping of copyrighted content. They say that the bill, if passed in its current form, would effectively legalize a form of digital exploitation.

Their core concerns include:

  • Lack of consent: Creative works are being used to train AI models without the creator’s knowledge or approval.
  • Absence of transparency: There is no public record or notice when specific works are used in training datasets.
  • No opt-out mechanism: Artists currently have little to no control over whether their content is excluded from AI training.
  • Commercial unfairness: AI companies, many of which are highly profitable, are benefiting financially from the uncompensated use of artistic labor.

Organizations such as the Society of Authors, the Musicians’ Union, and the Music Managers Forum have been particularly outspoken. In a joint statement, they urged the UK government to ensure that the bill does not create a copyright loophole under the pretense of promoting innovation.

“We welcome the development of AI, but it must not come at the expense of the rights and livelihoods of creators,” the Society of Authors said in a press release. “Our work is not free fuel for AI engines.”

Legal Gray Areas: Text and Data Mining Exemptions

Much of the concern centers on provisions in the bill that expand exemptions for text and data mining. While these practices are vital for academic and scientific research, their use in commercial AI development remains contentious.

Under the current draft, the bill could be interpreted to allow companies to mine any publicly accessible content without needing a license. This would effectively enable AI developers to train their systems on copyrighted books, artworks, and music without paying the original creators—a move critics say undermines the UK’s existing IP laws.

This legal gray area has prompted fears that the bill could set a dangerous precedent, weakening the protections artists have long relied on.

Government Response: Balancing Innovation with Creative Rights

In light of the backlash, the government has signaled that it is listening to concerns from the creative sector. Ministers have indicated that amendments to the bill are forthcoming, with the goal of striking a better balance between:

  • Fostering innovation and AI development, and
  • Upholding creators' rights, consent, and fair compensation.

A government spokesperson stated: “We remain committed to supporting the UK’s vibrant creative industries while ensuring that our legislative framework encourages safe, transparent, and responsible AI development. The forthcoming changes to the bill will make clear that the exemptions are not a carte blanche for AI developers.”

Consultations Ahead: Including Creators in the Conversation

To address the issue thoroughly, the UK government is expected to engage in further consultations with artists, rights holders, and industry representatives. These discussions aim to:

  • Define the limits of text and data mining exemptions.
  • Explore licensing frameworks that would allow creators to be fairly compensated.
  • Ensure greater transparency in how AI datasets are compiled and used.

This approach echoes similar debates taking place across the globe. In the United States, authors and visual artists have filed copyright infringement lawsuits against AI companies such as OpenAI and Stability AI. Meanwhile, the European Union has taken a more prescriptive approach: its AI Act requires providers of general-purpose AI models to comply with EU copyright law, including rights holders' opt-outs from text and data mining, and to publish summaries of the material used in training.

What Could Change: Possible Outcomes of the Bill’s Revision

Although the full scope of the changes remains under discussion, several likely outcomes have been identified by legal analysts and digital rights experts. These include:

Stricter Conditions for AI Training
The revised bill may introduce stricter criteria under which AI developers can access and use creative content. This could involve limiting data mining exemptions to non-commercial research or mandating more rigorous justifications for commercial use.
 

Mandatory Opt-Out Mechanisms
One of the most sought-after changes is the creation of a simple, enforceable opt-out mechanism, allowing creators to register their works and prevent them from being used in AI training.
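By way of illustration only, the sketch below shows one way an opt-out signal could be honoured in practice. It uses robots.txt, an existing machine-readable convention, as a stand-in for whatever registry or signal the revised bill might ultimately require; the crawler name and URL are hypothetical, and this is not a mechanism the bill itself prescribes.

```python
# Illustrative opt-out check, assuming (hypothetically) that publishers signal
# their opt-out via robots.txt. The crawler name below is made up.
from urllib import robotparser
from urllib.parse import urlparse

AI_TRAINING_AGENT = "ExampleAITrainingBot"  # hypothetical AI training crawler


def may_mine(url, agent=AI_TRAINING_AGENT):
    """Return True only if the site's robots.txt permits this crawler to fetch the URL."""
    parts = urlparse(url)
    robots = robotparser.RobotFileParser()
    robots.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    robots.read()
    return robots.can_fetch(agent, url)


if __name__ == "__main__":
    url = "https://example.com/some-artwork-page"  # placeholder URL
    if may_mine(url):
        print("robots.txt permits fetching; proceed (subject to licensing).")
    else:
        print("Opt-out signalled; do not ingest this content.")
```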
 

Licensing and Compensation Models
The government may consider frameworks where AI companies must obtain licenses to use copyrighted material, similar to how streaming platforms pay royalties to musicians and filmmakers.
 

Greater Transparency Requirements
New transparency rules may require AI developers to disclose the sources of their training data, giving creators more visibility into how and where their content is being used.
 

Penalties for Unauthorized Use
The bill may also introduce or strengthen penalties for AI companies that misuse copyrighted content or fail to comply with opt-out and licensing requirements.
 

Broader Implications: The Future of AI Ethics and Creativity

The outcome of this legislative process will have significant ramifications not just for the UK, but globally. As one of the world’s major creative and technological hubs, the UK’s approach could influence other countries grappling with similar challenges.

For AI developers, the proposed amendments may introduce new compliance hurdles—but they also present an opportunity to build trust with the public and the creative sector. Ethical and transparent use of data is increasingly viewed as essential to the long-term success of AI technologies.

For creators, the stakes are even higher. The ability to control how their work is used in the digital age is not just a legal issue—it’s a matter of livelihood, artistic integrity, and cultural preservation.

Conclusion: Toward a Fairer Digital Future

As the UK government works to finalize its Data Protection and Digital Information Bill, it faces a complex balancing act. On one hand, there is a clear need to support innovation and position the UK as a leader in AI development. On the other, there is a moral and economic imperative to protect the rights of creators whose work underpins much of the content AI systems rely on.

The coming months will be crucial as lawmakers, creatives, and tech companies continue to negotiate the future of data rights in the age of artificial intelligence. One thing is clear: a digital economy that thrives on creativity must also safeguard the creators who make it possible.

Need Independent Legal Advice Fast?


Our friendly solicitors offer remote ILA appointments at a time that suits you — with same-day service available. Book online or call us now for expert, hassle-free legal advice you can trust.

Book Your ILA Appointment Today