As AI technology advances, its impact on global copyright — with major litigation and potential new regulation on the cards — will ramp up this year, write Charlotte Fleetwood-Smith and Rebecca Pakenham-Walsh of Fieldfisher.
A version of this article first appeared in World Intellectual Property Review on 2 January 2025.
Artificial Intelligence (AI), especially the generative AI boom, has significantly impacted the copyright world in recent years. In 2025, major litigation, potential new government regulation and technological advancements will further shape the global copyright landscape.
Litigation
Case law will continue to dominate 2025 copyright headlines as major cases progress towards trial and as uncertainty remains about how existing laws apply.
Everybody will be watching the UK High Court action in Getty Images v Stability AI. Getty Images claims that Stability AI has infringed intellectual property rights including copyright in content owned or represented by Getty Images. Its position is that Stability AI has unlawfully copied and processed millions of images protected by copyright and the associated metadata owned or represented by Getty Images without a licence.
A decision in Getty Images v Stability AI has the potential to significantly shape the AI copyright litigation landscape. It will provide important guidance on how business models like those of Stability AI are treated under UK copyright law, which will also be of substantial interest to other jurisdictions. It may also have a bearing on the trajectory of similar litigation in the pipeline, and on where the correct balance between rights holders and AI developers should be struck.
In the US, a swell of litigation in other cases, such as The New York Times v OpenAI and Microsoft, is working its way through the courts. While we are unlikely to see these cases go to trial in 2025, US litigation will be keenly watched by content owners and AI developers alike, given its relevance to their businesses. Key issues, such as whether the use of content to train AI models can be deemed ‘fair use’, will be under the microscope over the coming year.
Elsewhere in the world, we are likely to see further litigation on the AI front, if recent lawsuits against OpenAI in Germany and Canada are anything to go by. The use of artists’ voices and likenesses, and the complexities arising with deepfakes and personality rights, are also emerging areas for dispute.
Regulatory balancing acts
Regulation and/or legislation on AI features high on the agenda in many jurisdictions and will continue to do so in 2025, with a focus on striking a balance between the interests of rights holders and the need to allow AI innovation.
On 16 December 2024, the UK government launched its highly anticipated consultation on copyright and AI, seeking views on how it can ensure the UK’s legal framework supports the UK creative industries and setting out proposals to give the creative industries and AI developers clarity on the law.
Most interestingly, proposals for a new text and data mining (TDM) copyright exception, scrapped in February 2023, are now back on the table. A broader TDM exception (akin to that in the EU’s Digital Copyright Directive, which the UK did not implement) is proposed, permitting AI training for commercial purposes while allowing rights holders to reserve their rights and control how their content is used.
The existing UK TDM exception at section 29A of the Copyright, Designs and Patents Act 1988 is limited: it applies only to research for non-commercial purposes and requires lawful access to the material.
The consultation addresses other problematic areas, such as transparency obligations for AI developers in relation to the data used to train AI models, the labelling of AI-generated content, and computer-generated works; greater transparency should make it easier to strike licensing deals. It also covers personality rights in the context of digital replicas, such as deepfake imitations of individuals, seeking views on whether the current legal frameworks are sufficiently robust to tackle these issues.
The consultation also signals that long overdue change is on the horizon, something much needed in the coming year if the UK wants to fulfil its ambitions to be a global AI leader while respecting its creative industries.
In addition, there have been calls to revisit whether the expansions in the Data (Use and Access) Bill (the Data Bill) covering data scraping for “scientific research” purposes adequately align with reasonable copyright protection. Some harmonisation between the Data Bill and the TDM exception favoured in the consultation may be warranted to ensure rights are adequately recognised.
EU AI Act provisions
Further activity around the new EU AI Act is expected in 2025 (the Act has been in force since 1 August 2024, but its AI/copyright provisions come into effect in August 2025). Providers of general-purpose AI models (as defined by the Act, e.g. ChatGPT) must comply with Article 53(1), which requires them to put in place policies that adhere to EU copyright law and to avoid TDM where rights holders have reserved their rights through an opt-out mechanism.
Providers must also publicly disclose sufficiently detailed summaries of the content used for AI training (a template will be issued in due course). Critically, the EU AI Act has some extraterritorial reach, meaning AI entities outside the EU will need to check whether they are captured, potentially establishing an EU standard by proxy. The obligation to summarise training content could also reveal previously unknown uses of copyright works, leading to more infringement actions against those caught by the Act.
The US continues to assess its approach, though a range of bills promoting transparency around the use of copyright works to train AI were introduced in 2024, seeking to address issues experienced by rights owners.
Evolving AI technology and ownership
Advancements in AI technology will continue in 2025, testing emerging regulation/legislation and raising more questions about ownership in AI-generated works.
Ownership will depend on the exact circumstances surrounding the creation of a particular AI-generated work, and some cases will be more clear cut than others. In the UK, if a literary, dramatic, musical or artistic work is created with the assistance of an AI tool but involves an element of human creativity (such as giving prompts to, or programming, the generative AI tool), then it will be easier to identify a human author and go on to apply the usual originality principles to the AI-generated work. We can expect to see further wranglings in 2025 over ownership of AI-generated works.
Cases in the Czech Republic and the US highlight difficulties in this area, posing more questions about what level of human input is required to meet human authorship requirements. The new UK AI copyright consultation also seeks to re-evaluate this point.
Final word
There is no doubt that 2025 is set to be another blockbuster year in the AI and copyright arena. Both litigation and regulatory efforts will hopefully help the UK to finesse its position in ongoing efforts to be a global AI superpower, with worldwide developments continuing to provide important guidance on evolving AI technologies. It will be a very important year for the UK government, which must ensure a balanced approach — recognising the importance of the creative economy while building on the strengths of the AI sector so both can flourish.
Charlotte Fleetwood-Smith is a senior associate at Fieldfisher. She can be contacted at: charlotte.fleetwood-smith@fieldfisher.com
Rebecca Pakenham-Walsh is a senior associate (professional support lawyer) at Fieldfisher. She can be contacted at: rebecca.pakenham-walsh@fieldfisher.com