CE Marking and EU Database for AI Systems

Guide to CE marking requirements and EU database registration for AI systems. Article 48, Article 49, Annex VIII, conformity declaration, and market access.

Legalithm Team · 28 min read
Topic: CE Marking
Updated: April 2026

CE Marking and EU Database Registration for AI Systems: Complete Guide

TL;DR

  • CE marking is the gateway to the EU market. From 2 August 2026, no high-risk AI system can be placed on the EU market or put into service without a valid CE mark. This applies to standalone AI systems classified under Annex III and to AI systems embedded in products already subject to EU harmonisation legislation listed in Annex I.
  • Article 48 governs how CE marking is affixed to AI systems — including digital-only products where a physical label is impossible. The mark must be visible, legible, and indelible, or accompany the system digitally.
  • Article 47 requires every provider to draw up an EU Declaration of Conformity before affixing the CE mark. The declaration must be kept up to date for as long as the system remains on the market.
  • Article 49 and Article 71 establish a mandatory EU database for high-risk AI systems. Both providers and certain deployers must register before the system is placed on the market or put into service.
  • Annex VIII lists the detailed information that must be submitted to the EU database — from provider identity and system description to conformity assessment outcomes and deployment status.
  • The conformity assessment route (self-assessment or notified body) determines what documentation feeds into the CE marking and registration process. See the conformity assessment guide for route selection.
  • Failing to CE-mark or failing to register can trigger fines of up to EUR 15 million or 3% of global annual turnover, whichever is higher (Article 99(4)); supplying incorrect, incomplete, or misleading registration data carries fines of up to EUR 7.5 million or 1% of turnover (Article 99(5)).
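
The turnover-based penalty ceiling is simply the larger of two figures. A minimal arithmetic sketch (illustrative only; actual fines are set case by case by the competent authorities, and the function name is ours):

```python
def fine_ceiling_eur(global_annual_turnover_eur: float) -> float:
    """Upper bound of an administrative fine under Article 99(4):
    EUR 15 million or 3 % of worldwide annual turnover, whichever is higher.
    Illustrative sketch only -- real fines are determined case by case."""
    return max(15_000_000.0, 0.03 * global_annual_turnover_eur)

fine_ceiling_eur(2_000_000_000)  # turnover-based ceiling (3 %) applies
fine_ceiling_eur(10_000_000)     # EUR 15 million floor applies
```

For small providers the fixed EUR 15 million figure dominates; the 3% branch only takes over above EUR 500 million in turnover.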

What CE marking means for AI systems

CE marking is not new. For decades, it has been the mandatory conformity indicator for products sold within the European Economic Area — from medical devices and machinery to toys and radio equipment. The two letters stand for Conformité Européenne, and they signal that a product meets all applicable EU health, safety, and environmental requirements.

What is new is that the EU AI Act (Regulation (EU) 2024/1689) extends CE marking to AI systems for the first time. Under Article 48, high-risk AI systems must bear CE marking before they can be placed on the EU market or put into service. This includes software-only products — a significant departure from the physical-product origins of CE marking.

How the AI Act intersects with existing CE marking regimes

The AI Act does not operate in isolation. Many AI systems are embedded in products that already carry CE marking under existing EU harmonisation legislation — known as the "New Legislative Framework" (NLF). The Regulation explicitly addresses this overlap:

  • Annex I, Section A lists the Union harmonisation legislation that applies alongside the AI Act: the Machinery Regulation, Toy Safety Directive, Medical Devices Regulation (MDR), In Vitro Diagnostic Regulation (IVDR), Radio Equipment Directive, and others.
  • Annex I, Section B lists additional legislation where CE marking requirements interface with the AI Act but where the conformity assessment integration follows a different path.

When an AI system is a safety component of a product covered by Annex I, Section A legislation, the AI Act requirements are assessed as part of the existing conformity assessment for that product. The manufacturer does not affix a separate CE mark for the AI component — a single CE mark covers compliance with all applicable legislation, including the AI Act.

For standalone high-risk AI systems not embedded in a physical product (e.g., SaaS-based recruitment screening or credit scoring), CE marking under the AI Act is the primary — and often only — CE marking obligation.

The August 2026 deadline

The high-risk AI provisions in Chapter III of the AI Act, including Articles 47, 48, and 49, apply from 2 August 2026. After that date:

  • No new high-risk AI system can enter the EU market without CE marking.
  • No high-risk AI system can be put into service without prior EU database registration.
  • Systems already on the market before 2 August 2026 must comply when they undergo a substantial modification as defined in Article 3(23).

Organizations should be working toward compliance now. The conformity assessment, documentation, and registration processes take months to complete — not days. For a full timeline of key dates, see the EU AI Act timeline guide.

When does an AI system need CE marking?

Not every AI system requires CE marking. The obligation is triggered by the system's risk classification and its relationship to existing EU product regulation. Here is how to determine whether your system needs it.

Standalone high-risk AI systems (Annex III)

If your AI system falls within one of the high-risk categories enumerated in Annex III — and it is not embedded in a product subject to other EU harmonisation legislation — it requires CE marking under the AI Act alone.

Annex III covers areas such as:

  • Biometric identification and categorisation
  • Management and operation of critical infrastructure
  • Education and vocational training (access, assessment, monitoring)
  • Employment, worker management, and recruitment
  • Access to essential private and public services (credit scoring, insurance pricing, emergency dispatch)
  • Law enforcement, migration, asylum, and border control
  • Administration of justice and democratic processes

Example: A fintech company builds an AI-based credit-scoring model offered as a SaaS platform. This is a standalone high-risk AI system under Annex III, point 5(b). It needs CE marking under the AI Act. There is no separate product directive involved because the system is purely digital.

If you are unsure whether your system qualifies as high-risk, use the classification guide or the Legalithm AI Act assessment tool to determine your obligation.

AI embedded in products covered by Annex I harmonised legislation

When an AI system serves as a safety component of a product that is already subject to EU harmonisation legislation listed in Annex I, Section A, the CE marking for the product must also cover AI Act compliance. The manufacturer goes through a single, integrated conformity assessment that addresses both the product directive and the AI Act.

Example: A manufacturer produces an AI-powered surgical robot. The robot is a medical device under the Medical Devices Regulation (EU) 2017/745. The AI module that controls surgical arm movement is a safety component. The manufacturer's conformity assessment for the MDR must now also verify compliance with Articles 8–15 of the AI Act. A single CE mark covers both.

Products with AI Act + sector regulation

For products regulated under multiple frameworks, the interaction is nuanced. Below is a decision table to help you identify which CE marking rules apply.

| System type | Primary legislation | AI Act applies? | CE marking route |
| --- | --- | --- | --- |
| Standalone HR screening SaaS | AI Act only | Yes — Annex III, point 4 | CE marking under AI Act (Article 48) |
| AI-powered medical device | MDR (EU) 2017/745 + AI Act | Yes — Annex I, Section A | Integrated conformity assessment; single CE mark |
| AI module in industrial machinery | Machinery Regulation (EU) 2023/1230 + AI Act | Yes — Annex I, Section A | Integrated conformity assessment; single CE mark |
| AI in radio equipment | Radio Equipment Directive 2014/53/EU + AI Act | Yes — Annex I, Section A | Integrated conformity assessment; single CE mark |
| AI in children's toys | Toy Safety Directive 2009/48/EC + AI Act | Yes — Annex I, Section A | Integrated conformity assessment; single CE mark |
| AI chatbot (not high-risk) | AI Act (transparency only) | Transparency obligations, but not high-risk | No CE marking required |
| GPAI model (provider) | AI Act Chapter V | GPAI rules apply, not high-risk rules | No CE marking required |

The key takeaway: CE marking is triggered by high-risk classification. If your AI system is not high-risk, CE marking under the AI Act is not required — though transparency and other obligations may still apply.
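
That takeaway can be condensed into a small decision helper. This is a hypothetical sketch for illustration (the function and parameter names are ours, not an official API, and real classification has many more edge cases):

```python
def ce_marking_obligation(high_risk: bool, embedded_in_annex_i_product: bool) -> str:
    """Simplified decision logic mirroring the table above.

    high_risk: the system is high-risk under Annex III, or is a safety
        component of a product covered by Annex I legislation.
    embedded_in_annex_i_product: the system is a safety component of a
        product covered by Annex I, Section A harmonisation legislation.
    """
    if not high_risk:
        return "No CE marking required (transparency or GPAI rules may still apply)"
    if embedded_in_annex_i_product:
        return "Integrated conformity assessment; single CE mark"
    return "CE marking under the AI Act (Article 48)"

ce_marking_obligation(high_risk=True, embedded_in_annex_i_product=False)
```

A standalone Annex III system takes the Article 48 branch; an AI-powered medical device takes the integrated branch.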

Article 48 — CE marking of conformity

Article 48 is the specific provision that mandates CE marking for high-risk AI systems. It is short but precise, and its requirements interlock with the broader CE marking framework established by Regulation (EC) No 765/2008.

What CE marking indicates

When a provider affixes CE marking to a high-risk AI system, they are formally declaring that:

  1. The system conforms to the requirements laid down in Chapter III, Section 2 of the AI Act (Articles 8–15).
  2. A conformity assessment has been successfully completed under Article 43 — either through self-assessment (Annex VI) or notified body assessment (Annex VII).
  3. An EU Declaration of Conformity has been drawn up in accordance with Article 47.
  4. The system has been registered in the EU database in accordance with Article 49.

CE marking is not merely a label — it is a legal representation. Affixing it without having completed the underlying obligations is an offence that can result in significant fines and forced withdrawal from the market.

Visibility and affixing requirements

Article 48 sets out the following rules for how CE marking is applied:

  • Physical products: The CE mark must be affixed visibly, legibly, and indelibly to the high-risk AI system or its data plate. If the physical nature of the system does not permit this, the mark must be affixed to the packaging or accompanying documentation.
  • Digital/software-only systems: When the AI system is provided only digitally (which is the case for most SaaS AI platforms), CE marking must be included in the digital documentation accompanying the system. This includes the terms of service, user interface, or other digital means that are easily accessible before purchase or deployment.
  • Existing product directives: When the AI system is part of a product that already requires CE marking under Annex I legislation, a single CE mark is used to indicate conformity with all applicable legislation — the AI Act plus the product directive. There is no requirement for a separate, additional CE mark for the AI component.

Relationship to EU Declaration of Conformity

CE marking and the EU Declaration of Conformity are two sides of the same coin. The declaration (Article 47) is the legal document that underpins the mark. The mark (Article 48) is the visible indicator of conformity. You cannot have one without the other.

A provider who affixes CE marking without a valid declaration, or who issues a declaration that does not correspond to the system's actual state of compliance, is in breach of the AI Act. Market surveillance authorities can — and will — request to see the declaration during inspections, complaint investigations, or random market checks.

EU Declaration of Conformity (Article 47)

The EU Declaration of Conformity is a formal, written statement by the provider that the high-risk AI system meets all applicable requirements of the AI Act. It is a prerequisite for CE marking and a key compliance artefact that market surveillance authorities will review.

Required contents

Under Article 47, the declaration must contain, at minimum:

  1. Provider identification — name, registered trade name or trademark, and the address at which the provider can be contacted.
  2. Confirmation of sole responsibility — a statement that the declaration is issued under the sole responsibility of the provider.
  3. System identification — the AI system's name, type, and any additional unambiguous reference allowing identification (including software version and hardware version where applicable).
  4. Statement of conformity — a declaration that the high-risk AI system is in conformity with Chapter III, Section 2 of the AI Act, and where relevant, with other Union harmonisation legislation.
  5. References to harmonised standards or common specifications — identification of any harmonised standards (Article 40) or common specifications (Article 41) applied during the conformity assessment.
  6. Notified body information (where applicable) — the name, identification number, and details of the certificate issued by the notified body, if a notified body was involved in the conformity assessment (Annex VII route).
  7. Place and date of issue — the location where the declaration was drawn up and the date.
  8. Signatory — the name, position, and signature of the person authorised to sign on behalf of the provider.

Template structure

While the AI Act does not prescribe a rigid template, a well-structured declaration typically follows this format:

EU DECLARATION OF CONFORMITY

| Field | Content |
| --- | --- |
| Document reference | [Unique document identifier] |
| Provider name | [Legal entity name] |
| Provider address | [Registered address] |
| System name | [AI system name and version] |
| System description | [Brief description of intended purpose] |
| Conformity statement | This AI system is in conformity with the requirements of Regulation (EU) 2024/1689, Chapter III, Section 2. |
| Harmonised standards applied | [List EN/ISO standards, if any] |
| Common specifications applied | [List common specifications, if any] |
| Notified body (if applicable) | [Name, ID number, certificate reference] |
| Additional legislation | [Other EU directives/regulations the system complies with, if applicable] |
| Place and date | [City, Country — Date] |
| Authorised signatory | [Name, Title, Signature] |

For detailed guidance on preparing the technical documentation that supports this declaration, see the Annex IV technical documentation template.

Language requirements

The declaration must be translated into the language(s) required by the Member State(s) in which the AI system is placed on the market or put into service. For a system marketed across the EU, this may mean translations into all 24 official EU languages — though in practice, most providers start with English and the languages of their primary target markets.

When the AI system is part of a product subject to additional EU product legislation, the language requirements of that legislation also apply.
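
A quick pre-launch check can flag missing translations for the target markets. A hypothetical sketch; the state-to-language mapping here is deliberately incomplete and illustrative (several Member States accept multiple languages, which this ignores):

```python
# Illustrative subset only -- verify each Member State's actual language rules.
REQUIRED_LANGUAGE = {"DE": "de", "FR": "fr", "IT": "it", "ES": "es", "NL": "nl"}

def missing_translations(target_states: list[str], available: set[str]) -> list[str]:
    """Languages still needed for the Declaration of Conformity,
    given the Member States targeted and the translations on hand."""
    needed = {REQUIRED_LANGUAGE[s] for s in target_states if s in REQUIRED_LANGUAGE}
    return sorted(needed - available)

missing_translations(["DE", "FR", "IT"], available={"en", "de"})  # ['fr', 'it']
```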

Continuous updating obligation

The EU Declaration of Conformity is not a one-time document. Article 47(3) mandates that the provider must update the declaration whenever necessary to reflect the current state of the AI system. This includes:

  • Updates following a substantial modification to the AI system.
  • Changes to the harmonised standards or common specifications referenced.
  • Changes in notified body certification status.
  • Expansion of deployment to new Member States with additional language requirements.

The declaration must be kept for 10 years after the AI system has been placed on the market or put into service — whichever is later.

EU database registration for high-risk AI systems

Beyond CE marking and the Declaration of Conformity, the AI Act creates a centralised EU database for high-risk AI systems. This database serves dual purposes: it enables market surveillance authorities to monitor the AI systems available in the EU, and it provides a degree of public transparency about which high-risk systems are in operation.

Article 71 — the EU database explained

Article 71 establishes the legal basis for the EU database for high-risk AI systems. The database is developed and maintained by the European Commission, in collaboration with Member States, and is publicly accessible — though certain information (e.g., deployer registration data for law enforcement systems) may be restricted to national competent authorities.

Key characteristics of the database:

  • Publicly accessible — the non-restricted portions are searchable by anyone, including businesses, consumers, civil society organisations, and researchers.
  • Machine-readable — the data is provided in a format that allows automated processing, enabling integrations with compliance management tools and market surveillance workflows.
  • Linked to national authorities — national market surveillance authorities have access to the full dataset, including restricted entries, to support enforcement activities.

The database is the primary mechanism through which the EU aims to create a registry of high-risk AI deployed across the single market. This is a significant step toward the kind of transparency that stakeholders have demanded regarding AI systems that affect fundamental rights.

Who must register

Registration obligations fall on two categories of actors:

1. Providers of high-risk AI systems

Every provider of a high-risk AI system must register the system in the EU database before placing it on the market or putting it into service. This applies regardless of whether the provider is established in the EU or in a third country (third-country providers typically register through their authorised representative).

2. Deployers that are public bodies or act on behalf of public bodies

Under Article 49(3), deployers of high-risk AI systems who are themselves public authorities, Union institutions, bodies, offices, or agencies, or who deploy on behalf of such entities, must also register in the EU database. This is a deployer-side obligation that goes beyond the provider's initial registration.

Example — public sector deployer: A national employment agency that deploys an AI system for benefit eligibility assessment (Annex III, point 5(a)) must register as a deployer in the EU database, even though the AI system's provider has already registered the system. The deployer registration captures how and where the system is being deployed in practice.

Private-sector deployers are generally not required to register in the EU database, though they may be subject to other record-keeping requirements under Article 26.

When to register

The timing is critical and strictly sequenced:

  • Providers must register before placing the AI system on the market or putting it into service — per Article 49(1).
  • Public-sector deployers must register before putting the system into service — per Article 49(3).
  • Registration must be updated whenever the registered information changes — including status changes (e.g., system recalled, withdrawn, or no longer on the market).

Placing a system on the market without prior registration is a compliance violation and can trigger enforcement action by market surveillance authorities, separate from any penalties for missing CE marking.
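
The sequencing rule can be verified mechanically against recorded dates. A hypothetical sketch assuming ISO-format date strings (conservatively treating same-day registration as too late, since Article 49 requires registration before placement):

```python
from datetime import date

def registration_in_time(registered_on: str, placed_on_market: str) -> bool:
    """Check the Article 49 sequencing rule: database registration must
    precede placing on the market or putting into service.
    Dates are ISO strings, e.g. '2026-06-01'. Same-day registration is
    treated as non-compliant here, a deliberately conservative reading."""
    return date.fromisoformat(registered_on) < date.fromisoformat(placed_on_market)

registration_in_time("2026-06-01", "2026-07-15")  # True  (registered first)
registration_in_time("2026-08-10", "2026-08-01")  # False (registered too late)
```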

Annex VIII — required registration information

Annex VIII provides the exhaustive list of information that providers must submit when registering a high-risk AI system. The requirements are detailed and span operational, technical, and legal domains.

Provider and representative details

| Required information | Description |
| --- | --- |
| Provider name and address | Legal name and registered address of the provider |
| Authorised representative (if any) | Name and address of the EU-based authorised representative for third-country providers |
| Contact person | Name and contact details of a designated contact person |
| Provider status | Whether the provider is a natural person, SME, micro-enterprise, or startup — relevant for proportionality provisions |

System identification and description

| Required information | Description |
| --- | --- |
| AI system name | Trade name or brand name under which the system is marketed |
| System type and version | Unique identifier, type designation, and version number |
| Intended purpose | Clear, detailed description of the system's intended purpose as defined in the instructions for use |
| System description | Non-technical summary explaining what the system does and how it works, understandable to a general audience |
| System status | Whether the system is on the market, in service, recalled, withdrawn, or no longer available |
| Annex III classification | The specific point in Annex III under which the system is classified as high-risk |

Conformity assessment details

| Required information | Description |
| --- | --- |
| Assessment route | Whether self-assessment (Annex VI) or notified body assessment (Annex VII) was used |
| Notified body (if applicable) | Name and identification number of the notified body involved |
| Certificate reference (if applicable) | Number and expiry date of the certificate issued by the notified body |
| Harmonised standards / common specifications | References to the standards or specifications applied |

Deployment and geographic information

| Required information | Description |
| --- | --- |
| Member States of availability | The EU Member States in which the system is or will be placed on the market or put into service |
| Dataset information | Where available, the URL or reference to datasets used for training, validation, and testing (Article 10 compliance) |
| EU Declaration of Conformity URL | A link or reference to the publicly accessible EU Declaration of Conformity |
| Instructions for use | URL or reference to the instructions for use (Article 13) |

The registration information must be accurate, complete, and current. Providers are responsible for updating the database when any registered information changes — including system modifications, status changes, or geographic expansion. Knowingly providing inaccurate data to the EU database is itself a violation of the AI Act, subject to administrative fines.
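A draft registration can be sanity-checked for obvious gaps before submission. A hypothetical sketch with illustrative field names; the official database defines the authoritative schema and will enforce its own validation:

```python
# Illustrative Annex VIII-style fields (names are ours, not the official schema).
REQUIRED_FIELDS = [
    "provider_name", "provider_address", "contact_person",
    "system_name", "system_version", "intended_purpose",
    "system_status", "annex_iii_classification",
    "assessment_route", "member_states", "declaration_url",
]

def missing_fields(registration: dict) -> list[str]:
    """Return required fields that are absent or empty in a draft payload."""
    return [f for f in REQUIRED_FIELDS if not registration.get(f)]

draft = {
    "provider_name": "Acme AI GmbH",
    "system_name": "ScreenAI",
    "member_states": ["DE", "FR"],
}
missing_fields(draft)  # lists the eight fields still to be completed
```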

The conformity assessment pathway to CE marking

CE marking is the end point of the conformity assessment process — not the starting point. Before a provider can affix the mark, the system must pass through a conformity assessment that verifies compliance with all Chapter III, Section 2 requirements. The AI Act provides two principal assessment routes, plus an integrated path for products with existing EU legislation.

Self-assessment (most Annex III systems)

The self-assessment route (also called internal control, defined in Annex VI) is the default for most high-risk AI systems classified under Annex III. Under this route, the provider performs the conformity assessment internally:

  • The provider's own compliance team reviews the quality management system, examines technical documentation, and verifies that the design and development processes match what is documented.
  • No external auditor or certification body is required.
  • The provider issues the EU Declaration of Conformity and affixes CE marking on their own authority.

When self-assessment is available: Self-assessment is permitted for all Annex III high-risk AI systems except those under point 1 (biometrics). For point 1 systems, Article 43(1) allows internal control only where the provider has fully applied harmonised standards (Article 40) or common specifications (Article 41); otherwise, notified body assessment is required.

Advantages: Faster (typically 2–8 weeks for a well-prepared provider), lower direct cost (EUR 5,000–30,000 in internal effort), and the provider retains full control of the timeline.

Risks: No external validation. If market surveillance authorities later determine that the assessment was inadequate, the provider bears full liability. For a deeper comparison, see the self-assessment vs notified body guide.

Notified body assessment (biometrics, critical infrastructure)

The notified body route (Annex VII) is mandatory for:

  • Biometric systems classified under Annex III, point 1, where the provider has not applied harmonised standards or common specifications, or has applied them only in part (Article 43(1)).
  • AI systems that are safety components of products already requiring third-party conformity assessment under Annex I, Section A harmonisation legislation, where the existing product directive mandates notified body involvement.

Under this route:

  • A designated notified body (an independent assessment organisation accredited by a Member State under Article 28) reviews the provider's quality management system and technical documentation.
  • The notified body may conduct testing, audit the development process, and verify ongoing compliance mechanisms.
  • If the system meets requirements, the notified body issues a certificate, valid for up to five years for Annex I systems and four years for Annex III systems (Article 44(2)), renewable upon re-assessment.
  • The certificate reference and notified body identification number appear in the EU database registration and the EU Declaration of Conformity.

Timelines and costs: Notified body assessments typically take 6–24 months and cost EUR 15,000–100,000+, depending on system complexity and the notified body's capacity.

Voluntary engagement: Even when self-assessment is permitted, a provider may voluntarily engage a notified body. This is strategically useful for systems deployed in sensitive contexts (e.g., healthcare, financial services) where external validation strengthens trust with customers, regulators, and the public.

Products with existing EU product legislation

For AI systems embedded in products subject to Annex I, Section A legislation, the conformity assessment integrates the AI Act requirements into the existing product assessment procedure.

How this works in practice:

  • The manufacturer follows the conformity assessment procedure set out in the relevant product directive (e.g., the MDR's clinical evaluation pathway for medical devices, or the Machinery Regulation's risk assessment pathway for industrial machinery).
  • Within that assessment, the manufacturer additionally verifies that the AI component meets Articles 8–15 of the AI Act.
  • If the product directive requires a notified body, that same notified body also assesses AI Act compliance.
  • The resulting CE mark covers compliance with all applicable legislation.

| Assessment pathway | Who uses it | External body required? | Typical timeline |
| --- | --- | --- | --- |
| Self-assessment (Annex VI) | Most Annex III high-risk systems | No | 2–8 weeks |
| Notified body (Annex VII) | Biometric ID systems; AI in products requiring third-party assessment | Yes | 6–24 months |
| Integrated assessment | AI embedded in Annex I, Section A products | Depends on product directive | Varies by product regulation |

Step-by-step process: from development to CE mark

The journey from AI system development to a valid CE mark and EU database registration follows a structured sequence. Skipping or misordering steps creates compliance gaps that market surveillance authorities will identify.

Step 1 — Classify your AI system

Before anything else, determine whether your AI system is high-risk under the AI Act. This requires evaluating:

  • Does the system fall within one of the areas listed in Annex III?
  • Is the system a safety component of a product covered by Annex I harmonisation legislation?
  • Does any exemption under Article 6(3) apply? (Systems performing narrow procedural tasks, systems that assist but do not replace human decisions, or systems used for purely preparatory functions may be excluded.)

Tool: Use the Legalithm AI Act classification assessment to determine your system's risk level in minutes.

If the system is not high-risk, CE marking under the AI Act is not required — but transparency obligations (Article 50), general-purpose AI rules (Chapter V), or prohibited practice rules (Article 5) may still apply. See the EU AI Act compliance checklist for a full obligations map.

Step 2 — Comply with Chapter III requirements

Once classified as high-risk, the provider must implement the substantive requirements of Chapter III, Section 2 — these require engineering, operational, and governance changes:

  • Risk management (Article 9): Continuous, iterative risk management across the AI system lifecycle.
  • Data governance (Article 10): Training, validation, and testing datasets meeting quality criteria. See the data governance guide.
  • Technical documentation (Article 11, Annex IV): Comprehensive documentation per the Annex IV template.
  • Logging (Article 12): Automatic logging enabling traceability.
  • Transparency (Article 13): Clear instructions for deployers on capabilities and limitations.
  • Human oversight (Article 14): Design enabling effective human monitoring and intervention. See the human oversight guide.
  • Accuracy, robustness, cybersecurity (Article 15): Appropriate levels of each, documented and tested.

The provider must also establish a quality management system (Article 17) and post-market monitoring (Article 72) before the conformity assessment begins.

Step 3 — Prepare technical documentation

Technical documentation under Annex IV is the evidentiary backbone of the compliance process. It must be prepared before the conformity assessment begins, covering: general system description and intended purpose; development process details (training methodologies, data, design choices, architecture); monitoring, functioning, and control information including accuracy levels; risk management measures; lifecycle change history; harmonised standards applied; a copy of the EU Declaration of Conformity; and the post-market monitoring system description.

Documentation must be retained for 10 years and provided to competent authorities on request. See the Annex IV documentation template guide for a detailed walkthrough.

Step 4 — Conduct conformity assessment

With compliance achieved and documentation prepared, conduct the conformity assessment under Article 43:

  • Self-assessment (Annex VI): Internally review the QMS, verify documentation against each requirement, and confirm that processes match documentation. Rigorous and documented — but no external auditor required.
  • Notified body (Annex VII): Engage an accredited body that reviews QMS, evaluates documentation, may conduct testing, and issues a certificate. Plan well in advance — capacity may be constrained near the August 2026 deadline.

The assessment must be fully completed before proceeding. A partial assessment is not sufficient for the Declaration of Conformity.

Step 5 — Draw up EU Declaration of Conformity

After successful conformity assessment, draw up the EU Declaration of Conformity under Article 47 in the required language(s), containing all mandatory information detailed above, signed by an authorised person, and retained for 10 years. The declaration URL is submitted as part of the EU database registration.

Step 6 — Affix CE marking

With the declaration in place, the provider affixes CE marking under Article 48:

  • Physical products: Apply the CE mark visibly and indelibly on the product, its data plate, or packaging.
  • Digital/software-only systems: Include CE marking in the digital documentation — the system's user interface, terms of service, or other accessible digital location.
  • Integrated products: Use a single CE mark covering both the product directive and the AI Act.

The CE mark must follow the visual design prescribed by Regulation (EC) No 765/2008 (Article 30 and Annex II): the letters "CE" in the specified proportional format, with a minimum height of 5 mm where physically affixed. It must not be confused with other marks or rendered illegible.

Step 7 — Register in EU database

The final step — and one that must be completed before the system is placed on the market — is registration in the EU database under Article 49.

The provider submits all Annex VIII information to the EU database:

  • Provider and representative identification.
  • AI system name, version, type, and intended purpose.
  • Risk classification reference (Annex III category).
  • Conformity assessment details and notified body certificate (if applicable).
  • Member States where the system will be made available.
  • System status.
  • Link to the EU Declaration of Conformity and instructions for use.

Important: The registration must be completed before the system is placed on the market or put into service. This is a hard sequencing requirement — not a "register when you get around to it" obligation.

After registration, the provider receives a unique EU database registration number that should be referenced in the system's documentation and made available to deployers.

Common pitfalls and compliance mistakes

Based on emerging compliance patterns and guidance from national authorities, the following are the most frequent mistakes organisations make with CE marking and EU database registration:

1. Treating CE marking as a formality rather than a legal commitment. CE marking is a legal declaration that all applicable requirements have been met. Affixing it prematurely — before the conformity assessment is complete, before the declaration is drawn up, or before the EU database registration is submitted — is a serious violation, not an administrative oversight.

2. Failing to update the EU database after system modifications. The registration obligation is not one-and-done. When the system undergoes a substantial modification (Article 3(23)), when its status changes (e.g., recalled, withdrawn), or when it is deployed in new Member States, the database entry must be updated. Stale or inaccurate data is itself a compliance violation.

3. Misidentifying the conformity assessment route. Providers who should use the notified body route (e.g., for biometric identification systems) but instead perform self-assessment will find that their CE marking is invalid. Conversely, providers who unnecessarily engage a notified body may incur avoidable costs and delays. Correct route identification is essential — see the conformity assessment guide.

4. Neglecting the declaration of conformity language requirements. A declaration that exists only in English but accompanies a system deployed in France, Germany, and Italy may not satisfy Member State language requirements. Providers must anticipate their target markets and prepare translations.

5. Not registering before placing on the market. The EU database registration must happen before the system enters the market — not simultaneously, not shortly after. Market surveillance authorities can verify the timestamp, and late registration is an independent compliance failure.

6. Overlooking deployer registration obligations for public bodies. Public-sector deployers often assume that the provider's registration is sufficient. It is not. Article 49(3) creates a separate registration obligation for deployers who are public authorities or act on behalf of public authorities.

7. Confusing CE marking under the AI Act with CE marking under product directives. For AI systems embedded in products, a single CE mark covers all applicable legislation. Affixing multiple CE marks — one for the AI Act and another for the product directive — is incorrect and non-compliant with Regulation (EC) No 765/2008.

8. Failing to retain documentation for the required 10-year period. Both the EU Declaration of Conformity and the technical documentation must be retained for 10 years from when the system was placed on the market or put into service. Organisations that rotate documentation systems, change cloud providers, or restructure without preserving these records create latent compliance risks.
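The 10-year clock in pitfall 8 is simple but easy to lose track of across documentation migrations. A minimal helper (illustrative only; `retention_deadline` is a hypothetical name for internal tooling) makes the deadline explicit:

```python
from datetime import date

def retention_deadline(placed_on_market: date) -> date:
    """Last day the EU Declaration of Conformity and technical documentation
    must still be retrievable: 10 years from the date the system was placed
    on the market or put into service."""
    try:
        return placed_on_market.replace(year=placed_on_market.year + 10)
    except ValueError:
        # 29 February in a non-leap target year
        return placed_on_market.replace(year=placed_on_market.year + 10, day=28)
```

Recording this date alongside each system's registration number helps ensure records survive cloud migrations and restructurings.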

For a comprehensive overview of all penalties associated with non-compliance, see the EU AI Act penalties and fines guide.

Frequently asked questions

Does CE marking apply to AI systems that are only deployed within one EU Member State?

Yes. CE marking is required for placing a high-risk AI system on the market or putting it into service within any EU Member State. There is no exemption for single-country deployment. The CE mark indicates compliance with EU-wide requirements, regardless of geographic scope within the Union.

Can I use an existing CE mark on my product to cover AI Act compliance?

Only if the conformity assessment that produced the existing CE mark also evaluated AI Act requirements. If you have a CE mark under the Medical Devices Regulation but have not assessed the AI component against Articles 8–15 of the AI Act, the existing mark does not cover AI Act compliance. After 2 August 2026, the conformity assessment must be updated to integrate AI Act requirements, and the CE mark then covers both.

What happens if I place a high-risk AI system on the market without CE marking?

Market surveillance authorities can order the system's withdrawal or recall from the market. The provider may face administrative fines of up to EUR 15 million or 3% of global annual turnover (whichever is higher) under Article 99(4). Additionally, non-compliant systems that cause harm expose the provider to civil liability claims.

Is the EU database registration publicly visible?

Partially. The general public can access most registration data, including the provider's identity, the system's intended purpose, its risk classification, and its conformity assessment status. However, certain registrations — particularly those related to law enforcement, migration, and border control AI systems — are held in a secure non-public section of the database, accessible only to the Commission and the national competent authorities.

How often must the EU database registration be updated?

There is no fixed schedule. The obligation is event-driven: the provider must update the registration whenever the registered information changes. This includes system modifications, status changes (e.g., from "on the market" to "recalled"), changes to the conformity assessment (e.g., certificate renewal), and expansion to new Member States. Providers should establish internal processes to trigger database updates when any relevant change occurs.
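One way to operationalise this event-driven obligation is a change-management hook that flags registration-relevant events. The event labels below are hypothetical internal names, not official AI Act terminology; the mapping to your own change-management system is the part that matters.

```python
# Internal change events that should trigger an EU database update.
# Labels are illustrative; map them to your own change-management tooling.
UPDATE_TRIGGERS = {
    "substantial_modification",   # Article 3(23) modification of the system
    "status_change",              # e.g. "on the market" -> "recalled"
    "certificate_change",         # notified body certificate renewed or withdrawn
    "new_member_state",           # system made available in additional markets
    "provider_details_change",    # name, address, or authorised representative
}

def database_update_required(event: str) -> bool:
    """True when an internal change event obliges an EU database update."""
    return event in UPDATE_TRIGGERS
```

Wiring such a check into release and change-approval workflows is one way to avoid the stale-data violation described in pitfall 2 above.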

Do third-country (non-EU) providers need to CE-mark and register?

Yes. The AI Act applies to providers who place high-risk AI systems on the EU market, regardless of where the provider is established. A US-based, Chinese-based, or UK-based provider selling into the EU must comply with CE marking (Article 48) and EU database registration (Article 49) in full. Third-country providers must appoint an authorised representative established in the EU under Article 22, who is responsible for certain compliance tasks including database registration.

Next steps

CE marking and EU database registration are the culmination of a comprehensive compliance effort — not standalone tasks. They depend on correct risk classification, substantive compliance with technical requirements, thorough documentation, and a properly executed conformity assessment.

If you are preparing for the August 2026 deadline, start by understanding where your AI systems fall within the classification framework. Use the Legalithm AI Act assessment tool to identify your obligations, then work through the compliance requirements systematically.
