Getty vs. Stability AI: What the UK ruling means for creators in the UK, Canada, and the U.S.
Gary Whittaker
Published: November 2025 · JackRighteous.com
Quick summary
- On November 4, 2025, the UK High Court largely rejected Getty’s core copyright claims against Stability AI. The judge also found a limited trademark issue where Getty-style watermarks appeared in some AI outputs.
- The decision turned on jurisdiction (insufficient proof that training happened in the UK) and specific UK statutory questions. It is important but narrow.
- The ruling does not set binding precedent in the United States or Canada. Both remain unsettled on AI training and copyright.
What this UK case was about
Getty Images sued Stability AI (maker of Stable Diffusion) in the UK, alleging the company used millions of Getty photos without permission to train an image model. The High Court’s public judgment was issued on November 4, 2025.
The decision, in plain terms
1) The main copyright claim did not succeed. The court said there wasn’t enough evidence that model training happened in the UK, which was essential to that claim under UK law.
2) Watermark/trademark issue: The judge found trademark infringement where outputs included a Getty-style watermark. That finding was limited to those facts; it does not make the ruling a blanket approval of all AI training.
3) What the court did not say: It did not declare that all AI training on copyrighted images is lawful. It did not set a global rule. The judgment focused on UK law and the case record.
Why people call it “landmark”
It is one of the first detailed UK judgments on a modern AI image model. It shows how current law handles training data and model outputs—at least in the UK, on these facts. But it leaves bigger questions to future cases or legislation.
What this means outside the UK
United States
- U.S. courts have separate, ongoing lawsuits about AI training and “fair use.” No single, final rule exists yet.
- The U.S. Copyright Office (May 2025) noted that copying for training can implicate rightsholders; whether it qualifies as fair use depends on the specific facts.
Canada
- Canada has no court ruling that squarely answers whether AI training on copyrighted works is “fair dealing” or requires licenses. The framework is still being discussed.
- Canada’s “fair dealing” is narrower than U.S. “fair use,” so assumptions that training is automatically allowed are risky. Policy work is ongoing.
Practical steps for creators (music, visuals, games, branding)
Use models and tools with transparency. Prefer providers that explain their training sources and offer terms suitable for commercial use. Keep a simple log of prompts, model versions, edits, and human input; this supports platform disclosures and helps answer questions later. A minimal example of such a log follows these steps.
Avoid publishing outputs with third-party watermarks or brand marks. If a generated image shows someone else's logo or watermark, do not use it commercially.
License where it matters. For key identity assets (album art, game art, logos, product graphics), consider licensed stock, commissioned work, or vendors who can document their training sources.
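For creators comfortable with a little scripting, here is one way to keep that log. This is a minimal sketch only: the ai_log.jsonl file name, the field names, and the log_generation helper are illustrative assumptions, not a standard or a platform requirement.

```python
# Minimal provenance log for AI-assisted work (illustrative sketch only).
# The file name and field names below are assumptions, not any platform's required format.
import json
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("ai_log.jsonl")  # one JSON record per line

def log_generation(prompt: str, model: str, edits: str, human_input: str) -> None:
    """Append one record describing a generation step and the human work around it."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "model_version": model,
        "edits": edits,               # e.g. cropping, colour correction, compositing
        "human_input": human_input,   # e.g. concept, sketches, manual retouching
    }
    with LOG_FILE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

# Example: note the prompt, the model build, and what you changed by hand.
log_generation(
    prompt="retro synthwave album cover, neon skyline",
    model="stable-diffusion-xl-1.0",
    edits="cropped to square, adjusted contrast, replaced background text",
    human_input="original layout concept and hand-drawn logo overlay",
)
```

A plain spreadsheet or notes file works just as well; what matters is recording the same details consistently, project by project.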
Bottom line
The UK ruling is an important data point, not a global green light. U.S. and Canadian rules are still developing. If you publish commercially, favor transparent tools, avoid outputs that carry watermarks or brand marks, and keep basic records of your creative steps.
Sources
- UK High Court judgment (PDF): Getty Images v Stability AI
- Judiciary case page: Getty Images v Stability AI
- Reuters (Nov 4, 2025): reporting on the ruling
- The Guardian (Nov 2025): case coverage
- The Verge (Nov 2025): analysis and implications
- U.S. Copyright Office (May 2025): Generative AI policy report hub
- Canadian government/Parliamentary materials and legal commentary on AI and copyright
This article is informational only and not legal advice.
Build smarter with AI (ethically)
Join The Righteous Beat for creator-safe workflows, templates, and case studies. Explore our GET JACKED toolkits for AI music, branding, and launch systems.
