Do we own AI outputs?

When your teams feed brand data into generative-AI tools, who owns what comes out? It’s a live question across boardrooms, as well as a growing source of legal uncertainty.

The reassuring answer? You usually do. But only if your inputs are clean and your contracts are watertight.

When shortcuts turn into lawsuits

Recent legal battles show how fast unclear ownership can flip into risk:

  • Getty Images v Stability AI: Getty dropped its headline copyright claims in 2025 after failing to prove what data trained Stable Diffusion. But its trademark and secondary infringement cases continue—raising real concerns for brands using AI-generated images.

  • Voice-actor cloning: In July 2025, a New York court allowed a claim against voice-tech company Lovo for using samples sourced from Fiverr. The judge confirmed that personality rights still apply—even in synthetic media.

  • Thaler v Perlmutter: In a March 2025 ruling, the DC Circuit confirmed that purely AI-generated works can’t be copyrighted in the US. If no human made a creative decision, it belongs to the public domain.

Each case hinges on one issue: control. Who supplied the data? Who made the edits? And was there a human in the loop?

Rulebooks are catching up

Legal frameworks are evolving fast:

  • The EU AI Act (from August 2025) requires model providers to publish public summaries of their training data. Non-compliance can trigger fines of up to 3% of global turnover.

  • The US Copyright Office (April 2025) states that outputs are only protected when a human makes “discernible creative choices”. Autonomously generated content? No copyright.

The takeaway: human involvement is more than an ethical safeguard; it’s legal cover.

The playbook for protecting your IP

Handled well, generative AI can scale your creative output without risking your rights. But it only works if you treat IP like the asset it is.

Four things every board should demand:

  • Scrub the inputs: No customer data. No trade secrets. No third-party content unless cleared.

  • Contract for ownership: Define background, foreground and joint IP. Agree who owns what – before the work begins.

  • Co-create, don’t delegate: Use AI to draft, suggest or remix – but make sure human authorship is obvious and traceable.

  • Audit and adapt: Review your usage, run regular IP checks, and update terms as models and risks evolve.

IP isn’t just a legal issue. It’s a leadership one. If AI is now part of your creative process, protecting your outputs starts with how you govern your inputs.
