Chapter III, Section 2 — Requirements for high-risk AI systems

Article 8: Compliance with the requirements

Applies from 2 Aug 2026 · EUR-Lex verified Apr 2026

Article 8 is the umbrella for Chapter III, Section 2: high-risk AI systems must comply with the requirements in that Section in light of their intended purpose and the generally acknowledged state of the art. Article 9’s risk management system must be taken into account when demonstrating compliance. Where a product embeds an AI system subject both to the AI Act and to Annex I Section A harmonisation laws, providers must ensure full compliance with every applicable Union requirement—and may integrate AI Act testing and documentation into existing product conformity processes where Article 8(2) allows.

Who does this apply to?

  • Providers of high-risk AI systems under Article 6 placing products on the Union market or putting them into service
  • Manufacturers integrating AI into products regulated under Annex I Section A harmonisation legislation
  • Quality, safety, and regulatory affairs teams aligning AI Act files with CE / product conformity workflows

Scenarios

A medical device software module regulated under Regulation (EU) 2017/745 (MDR) is high-risk under Article 6(1), because the MDR is Annex I Section A harmonisation legislation.

Where integration is permitted, Article 8(2) calls for one coherent compliance story across AI Act Section 2 and the MDR rather than duplicate documentation silos.
Ref. Art. 8(2)

A provider documents Annex IV technical files but omits lifecycle risk updates required by Article 9.

Article 8(1) explicitly links Section 2 compliance to the risk management system; gaps undermine the conformity narrative.
Ref. Art. 8(1)

A robotics OEM argues state of the art is 'whatever the startup shipped last month' without method or sources.

Article 8(1) expects demonstrable alignment with generally acknowledged state of the art for AI and related technologies—document methodology and references.
Ref. Art. 8(1)

What Article 8 does (in plain terms)

Paragraph 1 states the master rule: high-risk AI systems must meet all Section 2 requirements (Articles 9–15 and related annexes) with intended purpose and state of the art in view, and must embed the Article 9 risk management system in that compliance story.

Paragraph 2 addresses combined product + AI Act situations: if Union harmonisation legislation listed in Annex I Section A also applies, the provider must ensure the product satisfies both sets of rules. The second subparagraph encourages integration of testing, reporting, information, and documentation into existing conformity procedures under that harmonisation law to avoid duplication—read the exact conditions on EUR-Lex.

Use Article 8 as the table of contents for your evidence map, then drill into Articles 9–15.

How Article 8 connects to the rest of the Act

  • Article 6 — When Section 2 applies.
  • Articles 9–15 — The substantive Section 2 requirements Article 8 summarises.
  • Annex I — Section A product laws referenced in Article 8(2).
  • Annex IV — Technical documentation is a core evidence pillar alongside Section 2.
  • Articles 16–27 — Obligations of providers, deployers, importers, and distributors once high-risk duties bite.
  • Article 43 — Conformity assessment routes interact with how you prove Section 2 compliance.
  • Article 113 — Dates when many high-risk rules become operational.

Practical checklist

  • Build a matrix: each Article 9–15 requirement → owner → artefact → review cadence.
  • For embedded products, assign a single responsible compliance lead who owns both AI Act and sectoral files (Article 8(2)).
  • Map state of the art claims to public standards, peer-reviewed methods, and internal benchmarks; refresh at major releases.
  • Where sectoral NB (notified body) processes exist, design shared evidence packs permitted under Article 8(2) second subparagraph—verify wording on EUR-Lex.
  • Link risk management outputs directly to test protocols and release gates.
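The matrix in the first bullet can be kept as a lightweight data structure with an automated gap check, so unowned requirements or missing evidence surface before an audit does. The sketch below is illustrative only: the owners, artefact names, and cadences are invented placeholders, not terms from the Act.

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    article: str                  # e.g. "Art. 9"
    topic: str                    # short label for the requirement
    owner: str = ""               # accountable team (empty = unassigned)
    artefacts: list[str] = field(default_factory=list)  # evidence locations
    review_cadence: str = ""      # e.g. "quarterly", "per release"

# Illustrative matrix for Articles 9-15; owners and artefact names are invented.
matrix = [
    Requirement("Art. 9", "Risk management", "Safety lead", ["risk_register_v3"], "per release"),
    Requirement("Art. 10", "Data governance", "Data lead", ["dataset_cards"], "quarterly"),
    Requirement("Art. 11", "Technical documentation"),                        # gap: no owner yet
    Requirement("Art. 12", "Record-keeping", "Platform", ["log_policy"], "quarterly"),
    Requirement("Art. 13", "Transparency", "Product", ["ifu_v2"], "per release"),
    Requirement("Art. 14", "Human oversight", "UX lead", ["oversight_spec"], "per release"),
    Requirement("Art. 15", "Accuracy and robustness", "ML lead", [], "per release"),  # gap: no artefact
]

def gaps(rows: list[Requirement]) -> list[str]:
    """Return articles still missing an owner or an evidence artefact."""
    return [r.article for r in rows if not r.owner or not r.artefacts]

print(gaps(matrix))  # ['Art. 11', 'Art. 15']
```

Running the gap check at each review cadence (and at release gates) turns the cross-walk from a static spreadsheet into a living control.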

Official wording (excerpt): Article 8(1)–(2)

Editorial note: The following reproduces Article 8(1) and (2) from the English consolidated text of Regulation (EU) 2024/1689. If the OJ layout differs, EUR-Lex Article 8 prevails.

1. High-risk AI systems shall comply with the requirements laid down in this Section, taking into account their intended purpose as well as the generally acknowledged state of the art on AI and AI-related technologies. The risk management system referred to in Article 9 shall be taken into account when ensuring compliance with those requirements.
2. Where a product contains an AI system, to which the requirements of this Regulation as well as requirements of the Union harmonisation legislation listed in Section A of Annex I apply, providers shall be responsible for ensuring that their product is fully compliant with all applicable requirements under applicable Union harmonisation legislation.
In ensuring the compliance of high-risk AI systems referred to in paragraph 1 with the requirements set out in this Section, and in order to ensure consistency, avoid duplication and minimise additional burdens, providers shall have a choice of integrating, as appropriate, the necessary testing and reporting processes, information and documentation they provide with regard to their product into documentation and procedures that already exist and are required under the Union harmonisation legislation listed in Section A of Annex I.

Recitals (preamble) on EUR-Lex

The recitals in the same consolidated AI Act on EUR-Lex explain proportionality, alignment with product law, and state of the art expectations that underpin Article 8. Use the official preamble on EUR-Lex; do not rely on unofficial recital lists without checking sequence and wording against the authentic text.

Compliance checklist

  • Maintain one cross-walk from Article 8 to Articles 9–15 with owners and evidence locations.
  • For Annex I products, list every overlapping directive/regulation and close gaps jointly with product safety teams.
  • Version-control integrated documentation packs when merging AI Act and sectoral conformity procedures.
  • Record how Article 9 risk outputs informed each Section 2 control.
  • Re-validate after substantial modifications (Article 12 triggers may also apply).

Stress-test your Section 2 evidence map with our free assessment.


Related annexes

  • Annex IV — Technical documentation

Frequently asked questions

Is Article 8 a separate checklist from Articles 9–15?

It frames how to read Section 2 as a whole and adds the product-law integration rule in paragraph 2. Your substantive duties still come from Articles 9–15 and the annexes.

Can we skip AI Act documentation if MDR technical files exist?

Only within the integration choice permitted by Article 8(2)’s second subparagraph and only where processes genuinely overlap—never assume equivalence without mapping each AI Act requirement.

Does Article 8 apply to AI that is part of a product covered by EU safety law?

Yes. Article 8 requires high-risk AI systems that are safety components of products under Annex I harmonisation legislation to comply with both the AI Act requirements (Articles 9–15) and the applicable product-safety legislation. The conformity assessment may be integrated into the existing product conformity process.