UK Launches Review On AI Use Of Creative Content

The UK government has launched a consultation on how AI developers can legally use copyrighted content, such as music, writing, and visual art, to train their models. The aim is to provide clarity for both creators and AI companies while addressing the conflicts surrounding ownership and compensation.

The proposals focus on giving creators more control over how their work is used. AI firms would also need to be more transparent about the data and sources used to train their systems.

The government is also looking into how digital replicas, such as deepfakes, are managed under current laws to prevent misuse of individuals’ voices, likenesses, or identities.

Why Are Copyright Laws Being Reconsidered?
Current laws have left many creators unable to track how their work is being used. With AI models drawing so heavily on material from the creative industries, some musicians, artists, and writers feel left behind, without fair payment or any system for granting permission.

On the other side, AI firms face uncertainty over which content they can legally use, which has slowed growth and added legal risk. Without a clear framework, neither side knows where the boundaries lie.

Earlier voluntary discussions between AI developers and creative industries failed to produce a solution. As a result, the government believes stronger measures are necessary to deal with these issues.

What Solutions Are On The Table?

The government is proposing changes to copyright law, including a possible exception that would let AI developers use copyrighted material to train their models. Rights holders would have the option to “reserve their rights”, allowing them to restrict usage or negotiate licensing deals.

On transparency, AI developers would need to publish information about which datasets they have used and how they were obtained. This would give creators a better understanding of where their work appears and how it contributes to AI training.

The proposals also encourage licensing agreements, which would allow creators to receive payment for their work while AI firms gain legal certainty over its use. The government plans to involve both sides to agree on fair standards.

What Do Creative Industries Think?

Many groups representing the creative industries, such as UK Music and the BPI, have pushed back on the proposed exception. They argue that it could weaken creators’ ability to protect their work and secure fair payments.

The Creative Rights in AI Coalition has launched a campaign calling for AI developers to get explicit permission before using copyrighted material. Backed by organisations such as PRS for Music and the Independent Society of Musicians, the coalition is also asking for stronger accountability rules to prevent misuse.

The Council of Music Makers has reinforced this position, arguing that creators must always give consent before their work, lyrics, or likeness is used in AI systems. Without that consent, they argue, creators lose value while AI firms profit.

Critics of the proposals believe that an exception to copyright law could discourage AI companies from negotiating proper licensing deals. They feel that fair protections and consent systems must remain at the centre of any legal changes.

Lisa Nandy, Secretary of State for Culture, Media and Sport, said, “This government firmly believes that our musicians, writers, artists and other creatives should have the ability to know and control how their content is used by AI firms and be able to seek licensing deals and fair payment.

“Achieving this, and ensuring legal certainty, will help our creative and AI sectors grow and innovate together in partnership. We stand steadfast behind our world-class creative and media industries which add so much to our cultural and economic life.

“We will work with them and the AI sector to develop this clearer copyright system for the digital age and ensure that any system is workable and easy-to-use for businesses of all sizes.”