AI Governance Institute

Practical Governance for Enterprise AI

Voluntary framework · EU · Limited risk · Minimal risk

EU Code of Practice on Marking and Labelling of AI-Generated Content

Issued by

European Commission

EU-AIGC-CoP · Verified April 2026

The European Commission published a first draft of a voluntary code of practice establishing common standards for marking and labelling AI-generated content across the EU. It targets organisations that produce, distribute, or deploy AI systems capable of generating text, images, audio, or video. The code aims to improve transparency and consumer awareness by requiring identifiable disclosure when content is AI-generated.

Applies To

Large enterprise · SMB · AI developer · AI deployer

Overview

The EU Code of Practice on Marking and Labelling of AI-Generated Content is a voluntary instrument developed under the broader EU AI Act ecosystem, intended to operationalise transparency obligations related to synthetic and AI-generated material. The first draft, published in early 2026, sets out common technical and procedural standards for how AI-generated content should be marked, labelled, or otherwise disclosed to end users and the public. The code applies to a wide range of actors including AI developers, platform operators, media distributors, and enterprise deployers who use generative AI tools in commercial or public-facing contexts.

Although voluntary in formal status, adherence to the code is expected to function as a compliance signal under the AI Act's transparency requirements, particularly for providers of general-purpose AI models and systems used in high-volume content generation. The European Commission is overseeing a multi-stakeholder drafting process, with the final version expected to be adopted following a consultation period. Non-participating organisations may face greater regulatory scrutiny when transparency-related provisions of the AI Act are enforced by national market surveillance authorities.

Key Requirements

  • Disclose to end users, via visible and accessible labelling, that content has been generated or substantially modified by an AI system
  • Apply machine-readable metadata or watermarking standards to AI-generated text, images, audio, and video where technically feasible
  • Maintain records of labelling practices to demonstrate compliance upon request by regulators or auditors
  • Implement internal policies and staff training to ensure consistent application of marking standards across content production workflows
  • Align labelling practices with any technical specifications issued by the European Commission or relevant standardisation bodies under the AI Act framework
  • Review and update marking practices when new versions of the code or associated technical standards are published
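To make the machine-readable metadata requirement concrete, the sketch below shows one way a content pipeline might attach a provenance record to a generated asset as a JSON sidecar. The `digitalsourcetype` value follows the published IPTC vocabulary (`trainedAlgorithmicMedia` denotes fully AI-generated media); the rest of the schema, including the field names and the `acme-imagegen-2` generator name, is an illustrative assumption, not something the draft code mandates.

```python
import json
import hashlib
from datetime import datetime, timezone

def build_disclosure_record(content_bytes: bytes, generator: str) -> dict:
    """Build a machine-readable disclosure record for an AI-generated asset.

    The digitalsourcetype value follows the IPTC vocabulary; all other
    fields are illustrative and would need to track the final code text.
    """
    return {
        "digitalsourcetype": "trainedAlgorithmicMedia",  # IPTC: fully AI-generated
        "generator": generator,                           # tool that produced the content
        "sha256": hashlib.sha256(content_bytes).hexdigest(),  # ties record to the asset
        "labelled_at": datetime.now(timezone.utc).isoformat(),
    }

record = build_disclosure_record(b"example generated image bytes", "acme-imagegen-2")
print(json.dumps(record, indent=2))
```

A sidecar like this complements, rather than replaces, the visible user-facing label: the visible label serves consumers, while the metadata record serves platforms and auditors checking provenance at scale.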

What Your Organization Must Do

  • Assign a named compliance lead, such as a Chief Compliance Officer or AI Governance Manager, to monitor the EU Code of Practice drafting process and track updates from the European Commission ahead of final adoption expected in 2026.
  • Conduct an inventory of all internal generative AI tools and third-party systems used to produce or distribute text, images, audio, or video in commercial or public-facing contexts, identifying every touchpoint where disclosure labelling must be applied.
  • Implement visible, user-facing AI content labels and machine-readable metadata or watermarking on AI-generated outputs where technically feasible, prioritising high-volume or high-visibility content pipelines before the final code is adopted to reduce remediation risk.
  • Establish a recordkeeping process to log labelling decisions, tools used, and responsible teams for each content type, ensuring audit-ready documentation is available to national market surveillance authorities on request.
  • Develop and deliver staff training on AI content marking standards for all teams involved in content creation, editorial review, and platform distribution, with completion tracked and refreshed whenever the code or associated technical specifications are updated.
  • Formally register for or engage with the European Commission's multi-stakeholder consultation process to influence technical specifications and receive early notice of binding updates, reducing the risk of greater regulatory scrutiny for non-participating organisations under the AI Act transparency provisions.
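The recordkeeping step above can be sketched as a simple append-style labelling log. The row schema here (content ID, content type, label applied, tool, responsible team) mirrors the fields named in this guidance, but the exact structure is an assumption for illustration; an organisation would align it with whatever documentation format auditors ultimately request.

```python
import csv
import io
from dataclasses import dataclass, asdict, fields

@dataclass
class LabellingEntry:
    """One audit-log row per labelling decision; field set is illustrative."""
    content_id: str
    content_type: str      # text / image / audio / video
    label_applied: str     # e.g. visible banner, embedded metadata, watermark
    tool: str              # generative system that produced the content
    responsible_team: str  # team accountable for the labelling decision

def write_log(entries: list) -> str:
    """Serialise entries to CSV, the kind of record auditors could request."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=[f.name for f in fields(LabellingEntry)])
    writer.writeheader()
    for entry in entries:
        writer.writerow(asdict(entry))
    return buf.getvalue()

log = write_log([
    LabellingEntry("img-0042", "image", "visible watermark + metadata",
                   "acme-imagegen-2", "Marketing"),
])
print(log)
```

Keeping the log append-only and timestamped per entry would strengthen its evidentiary value if national market surveillance authorities request it.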

Playbook Guidance

Step-by-step implementation guidance for compliance teams.

Frequently Asked Questions

Is the EU Code of Practice on AI-Generated Content Marking legally binding?
No, the code is formally voluntary. However, adherence is expected to signal compliance with the EU AI Act's transparency obligations, and non-participating organisations may attract greater scrutiny from national market surveillance authorities when those provisions are enforced.
Which types of organisations does the EU-AIGC-CoP apply to?
The code targets AI developers, platform operators, media distributors, and enterprise deployers that use generative AI to produce or distribute text, images, audio, or video in commercial or public-facing contexts. It applies to both large enterprises and SMBs operating in the EU.
What specific labelling actions does the code require organisations to take?
Organisations must apply visible, user-facing disclosures on AI-generated content and embed machine-readable metadata or watermarking where technically feasible. They must also maintain records of labelling decisions and implement internal policies and staff training to ensure consistent application.
When is the final version of the EU-AIGC-CoP expected to be adopted?
The European Commission published a first draft in early 2026 and is running a multi-stakeholder consultation process. The final version is expected to be adopted following completion of that consultation, though no binding effective date has been confirmed yet.
How does the EU-AIGC-CoP relate to the EU AI Act transparency requirements?
The code is designed to operationalise the AI Act's existing transparency obligations for AI-generated content, particularly for providers of general-purpose AI models used in high-volume content generation. Compliance with the code is expected to serve as a practical benchmark for meeting those statutory duties.
What documentation should companies retain to satisfy the EU-AIGC-CoP audit requirements?
Organisations should log labelling decisions, the generative AI tools used, content types covered, and the teams responsible for each workflow. This documentation must be available on request from national market surveillance authorities or auditors reviewing AI Act compliance.