17 February 2025 · 5 minute read

AI and authorship: Navigating copyright in the age of generative AI

In a previous blog post, we explored the principle under EU law that human intervention is a prerequisite for granting copyright protection to works created using generative AI tools.

The extent to which works qualify for copyright protection depends on whether they constitute a "specific and concrete form" and reflect the author's "own intellectual creation," resulting from their free and creative choices.

While copyright protection for AI-generated works has not been ruled out, legal experts are eagerly awaiting case law to clarify how courts will interpret these requirements in practice.


Who can claim authorship of AI-generated works?

If AI-generated works are deemed eligible for copyright protection, the next critical question is: who can claim authorship?

Users of generative AI tools: The prevailing view is that copyright, if granted, would most likely belong to the natural person who actively uses the AI tool in their creative process. If a user provides highly detailed and specific instructions, input and prompts, effectively shaping the AI-generated output to reflect their intended creative vision, they could be recognised as the author. The key factor is the degree of human involvement in directing and refining the output.

Authors of pre-existing copyrighted works: If an AI tool relies on pre-existing copyrighted material (whether uploaded by the user or scraped from publicly available sources), the original rights holder of such material may have a claim over the resulting output. However, most widely used generative AI platforms, such as ChatGPT and Microsoft Copilot, explicitly prohibit users from inputting third-party copyrighted content without permission, complicating such claims.

Developers and owners of AI tools: The legal consensus suggests that AI tool developers are not the authors of the outputs generated by their software. While they may hold intellectual property rights in the software itself, they do not exercise sufficient creative control over individual outputs to qualify as authors. Similarly, the mere ownership of an AI tool does not confer rights to its output, regardless of any possible contrary provisions in user agreements (cf. infra). To enforce copyright in court, a claimant has to prove the work is protectable under copyright law, which would be problematic for an AI tool provider attempting to claim ownership over all user-generated content.


Challenges in copyright enforcement and burden of proof

A more complex issue arises in enforcement: how does a claimant prove authorship of an AI-generated work? When copyright ownership is disputed (after it has been established that the work itself is original), the claimant must be able to prove they actually own the copyright in the work, either because they are the natural person who created the work or because they have acquired the rights from the original author.

Under Belgian law, for example, this burden of proof is considerably reduced by Article XI.170(2) of the Code of Economic Law (CEL), which provides that authorship is presumed to belong to the person whose name or pseudonym appears on the work, unless the contrary is proved.

While this presumption of authorship has rarely been litigated, disputes may increase as AI-generated works proliferate, and courts are likely to grapple with distinguishing human-created from AI-created works. After all, the presumption of Article XI.170(2) CEL applies only until the contrary is proved. Although this opens the door to abuse, it seems difficult for courts to raise the claimant's burden of proof by requiring them to show that they actually created the work and that it was not created by or with the help of AI: that would amount to an impossible negative proof and would run contrary to Article XI.170(2) CEL.

That being said, the purported author of a work created by AI may be in violation of the AI tool's terms of use; for example, ChatGPT's terms of use explicitly prohibit the user from "representing that the output is human-generated when it is not."

Defendants facing infringement claims may also find it difficult to prove that a work was AI-generated rather than created by the claimant. Since AI tools can produce varied outputs even from identical prompts, it will be difficult, without direct access to the prompts used, to prove that a particular output was the result of AI rather than human creativity.

Further complicating matters, AI platforms disclaim responsibility for the uniqueness of outputs. For example, OpenAI’s ChatGPT terms explicitly acknowledge that multiple users might receive identical or similar results, while Microsoft's Copilot terms contain similar provisions. This raises questions for users seeking to claim rights over AI-generated works, particularly in commercial contexts.

Users should also be aware that any content they upload to an AI tool may be used by that tool to generate other content. For example, Microsoft Copilot's terms of use state that Microsoft does not claim ownership of user-submitted content, but that, by using the services, the user grants Microsoft, its affiliated companies and third-party partners permission to use that input in connection with the operation of their businesses.

In other words, original, self-created content uploaded to an AI tool may resurface in outputs provided to other users.


Best practices for AI-assisted creators

In light of these challenges, creators using AI should adopt best practices to substantiate their claims to authorship:

  • Document the creative process: keep records of input materials, including rough drafts, sketches, or textual concepts that predate AI-generated outputs.
  • Preserve prompts and iterations: keep detailed records of the prompts entered into AI tools, along with successive refinements, to illustrate the author's personal mark (see the illustrative sketch after this list).
  • Be careful with third-party content: avoid inputting copyrighted material into AI tools unless expressly authorised, as this could complicate authorship claims.
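
By way of illustration only, the sketch below shows one possible way a creator might keep a timestamped, append-only record of prompts, refinements and supporting input files. The file name, log structure and helper function are hypothetical assumptions for this example, not part of any AI tool's API; the point is simply that contemporaneous records of the creative process are straightforward to maintain.

```python
import json
import hashlib
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical append-only log of prompts and iterations (JSON Lines file).
LOG_FILE = Path("creation_log.jsonl")

def log_iteration(prompt: str, output_summary: str, source_files: list[str] | None = None) -> None:
    """Append one prompt/output iteration to the local log, with a UTC timestamp
    and a hash of any pre-existing input files (drafts, sketches, concepts)."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "output_summary": output_summary,
        # Hash input files so their content and existence at this date can be shown later.
        "input_hashes": {
            name: hashlib.sha256(Path(name).read_bytes()).hexdigest()
            for name in (source_files or [])
            if Path(name).exists()
        },
    }
    with LOG_FILE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry, ensure_ascii=False) + "\n")

# Example: record a refinement step that references an earlier human-made draft.
log_iteration(
    prompt="Refine the second verse to match the tone of my 2024 draft",
    output_summary="Tool returned a revised verse; lines 3-4 kept unchanged",
    source_files=["draft_2024.txt"],
)
```

Such a log does not in itself secure copyright protection, but it can serve as evidence of the human choices and pre-existing material behind an AI-assisted work if authorship is later disputed.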

While the legal landscape continues to evolve, these steps can help mitigate risks and provide stronger evidence for copyright claims. As courts begin to address these issues, legal practitioners must remain vigilant in assessing how AI-generated works fit within existing copyright frameworks.