Chapter III, Section 2 — Requirements for high-risk AI systems · Article 12

Article 12: Record-keeping

Applies from 2 August 2026 · 6 min read · EUR-Lex verified April 2026

Article 12 requires high-risk AI systems to technically enable automatic logging of events (logs) over the lifetime of the system. Logging must be appropriate to the intended purpose and support traceability for: (a) situations that may present a risk within Article 79(1) or indicate a substantial modification; (b) Article 72 post-market monitoring; and (c) monitoring under Article 26(5). For remote biometric identification systems under Annex III point 1(a), Article 12(3) adds minimum log fields (use periods, reference database, matching inputs, verifier identity). Align implementation with Article 13(3)(f) on deployer-side log interpretation.

Who does this apply to?

  • Providers, who must build logging capabilities into high-risk AI systems before placing them on the market or putting them into service
  • Deployers, responsible for operating logs under Article 26 and for verification workflows under Article 14(5) where Annex III point 1(a) applies

Scenarios

A credit scoring engine stores only aggregate daily metrics with no per-inference trace for anomaly investigation.

Likely insufficient for Article 12(1)–(2) traceability expectations for the intended Annex III use—design logs with purpose-appropriate events.
Ref. Art. 12(1)–(2)

A biometric watchlist system records session windows, database references, matched inputs, and verifier IDs.

Aligned with Article 12(3) minimum fields for Annex III point 1(a)—confirm exact wording on EUR-Lex.
Ref. Art. 12(3)

What Article 12 requires (in plain terms)

Paragraph 1: High-risk systems must technically allow for automatic recording of events (logs) across the entire lifetime.

Paragraph 2: Logging capabilities must enable recording of events relevant to:

  • (a) Identifying situations that may lead to a risk under Article 79(1) or a substantial modification
  • (b) Supporting Article 72 post-market monitoring
  • (c) Monitoring under Article 26(5)

The depth and granularity must be proportionate to the intended purpose—always verify lists and definitions on EUR-Lex Article 12.
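A purpose-appropriate event record along the lines of paragraph 2 could be sketched as follows. This is a minimal illustration, not wording from the Act; every field name here is an assumption:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class LogEvent:
    """Illustrative per-event log record supporting Article 12(2) traceability."""
    timestamp: str      # when the event occurred (ISO 8601, UTC)
    event_type: str     # e.g. "inference", "config_change", "anomaly"
    model_version: str  # supports substantial-modification detection (12(2)(a))
    outcome: str        # decision/score summary for post-market monitoring (12(2)(b))
    operator_id: str    # who operated the system, for Article 26(5) monitoring (12(2)(c))

def new_event(event_type: str, model_version: str,
              outcome: str, operator_id: str) -> dict:
    """Build a serialisable event dict stamped with the current UTC time."""
    return asdict(LogEvent(
        timestamp=datetime.now(timezone.utc).isoformat(),
        event_type=event_type,
        model_version=model_version,
        outcome=outcome,
        operator_id=operator_id,
    ))
```

The point of keying each record to a model version is that a drift between logged versions and documented versions is exactly the kind of substantial-modification signal paragraph 2(a) is after.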

Paragraph 3: Additional minimum logging rules apply to systems referred to in Annex III point 1(a) (remote biometric identification systems, excluding pure biometric verification, in the annex wording—confirm the category on EUR-Lex).
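For Annex III point 1(a) systems, the four Article 12(3) minimums might be captured in a record like the one below. The class and field names are assumptions for illustration; the authentic wording on EUR-Lex governs what must actually be logged:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class BiometricUseRecord:
    """Sketch of the Article 12(3)(a)-(d) minimum log fields."""
    use_start: str                    # (a) start date and time of the use (ISO 8601)
    use_end: str                      # (a) end date and time of the use
    reference_database: str           # (b) database the input was checked against
    matched_input_ref: Optional[str]  # (c) input data that led to a match, if any
    verifier_ids: tuple[str, ...]     # (d) persons verifying results (Art. 14(5))

    def is_complete(self) -> bool:
        """Presence check: a match recorded without a verifier is flagged."""
        if self.matched_input_ref is not None and not self.verifier_ids:
            return False
        return bool(self.use_start and self.use_end and self.reference_database)
```

A check like `is_complete()` is one way to test end-to-end that Article 14(5) verification steps actually leave a trace in the Article 12(3)(d) field.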

How Article 12 connects to the rest of the Act

  • Article 11 — Logging design and retention should be reflected in Annex IV and change control.
  • Article 13 — Article 13(3)(f) requires information enabling deployers to collect, store, and interpret logs per Article 12.
  • Article 14(5) — Two-person verification context for Annex III point 1(a) ties to log fields in Article 12(3)(d).
  • Article 72 — Logs feed post-market evidence.
  • Article 73 — Serious-incident workflows may rely on log exports (where applicable).
  • Annex III — Determines whether Article 12(3) minimums apply.
  • Article 113 — Application dates.

Practical checklist

  • Define an event schema (who/what/when/outcome/model version) aligned with Article 12(2)(a)–(c).
  • Implement tamper-evident storage, access control, and retention aligned with GDPR and sector law where personal data is logged.
  • For Annex III point 1(a), implement Article 12(3) fields and test end-to-end with Article 14(5) procedures.
  • Document how deployers will access and interpret logs in the Article 13 IFU.
  • Run tabletop exercises: can you reconstruct an incident timeline from logs alone?
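The tamper-evident storage bullet above can be approximated with hash chaining: each entry commits to the one before it, so altering any past record invalidates every later link. A sketch only; a production system would add signing, WORM storage, and access control on top:

```python
import hashlib
import json

GENESIS = "0" * 64  # sentinel hash for the first entry

def append_event(chain: list[dict], event: dict) -> list[dict]:
    """Append an event, linking it to the previous entry's hash."""
    prev_hash = chain[-1]["entry_hash"] if chain else GENESIS
    payload = json.dumps(event, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    chain.append({"event": event, "prev_hash": prev_hash, "entry_hash": entry_hash})
    return chain

def verify_chain(chain: list[dict]) -> bool:
    """Recompute every link; False means some entry was altered."""
    prev_hash = GENESIS
    for entry in chain:
        payload = json.dumps(entry["event"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev_hash"] != prev_hash or entry["entry_hash"] != expected:
            return False
        prev_hash = entry["entry_hash"]
    return True
```

Running `verify_chain` on a log export is also a cheap artefact to show auditors that post-hoc edits would have been detectable.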

Official wording (excerpt): Article 12(1)–(3) (English)

Editorial note: The following reproduces Article 12(1) through (3) from the English consolidated text of Regulation (EU) 2024/1689. If numbering or annex cross-references differ in your EUR-Lex view, the authentic HTML prevails. Always re-open EUR-Lex before compliance decisions.

1. High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system.
2. In order to ensure a level of traceability of the functioning of a high-risk AI system that is appropriate to the intended purpose of the system, logging capabilities shall enable the recording of events relevant for:
(a) identifying situations that may result in the high-risk AI system presenting a risk within the meaning of Article 79(1) or in a substantial modification;

(b) facilitating the post-market monitoring referred to in Article 72; and

(c) monitoring the operation of high-risk AI systems referred to in Article 26(5).

3. For high-risk AI systems referred to in point 1(a) of Annex III, the logging capabilities shall provide, at a minimum:

(a) recording of the period of each use of the system (start date and time and end date and time of each use);

(b) the reference database against which input data has been checked by the system;

(c) the input data for which the search has led to a match;

(d) the identification of the natural persons involved in the verification of the results, as referred to in Article 14(5).

Further paragraphs or subpoints (if any in the consolidated act) — confirm on EUR-Lex Article 12.

Recitals (preamble) on EUR-Lex

The recitals in the same consolidated AI Act on EUR-Lex contextualise traceability, market surveillance, and biometric identification safeguards linked to logging. Use the official preamble on EUR-Lex; do not rely on unofficial recital lists without checking sequence and wording against the authentic text.

Compliance checklist

  • Ship logging hooks before first placement; do not retrofit as an afterthought.
  • Map log events to Article 79 risk scenarios and change-management triggers.
  • Align retention and access with GDPR and national sector rules.
  • Test log export formats for authorities and notified bodies.
  • Review logging after UI, model, or data-pipeline changes.
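A tabletop-style check of the export and timeline bullets can be sketched as filtering a log export into an incident window; the `timestamp` field name is an assumption, and an empty result for a known incident window signals a logging gap:

```python
from datetime import datetime

def incident_timeline(events: list[dict], start: str, end: str) -> list[dict]:
    """Return events whose timestamp falls in [start, end], oldest first.

    Assumes each event carries an ISO 8601 'timestamp' field.
    """
    lo, hi = datetime.fromisoformat(start), datetime.fromisoformat(end)
    window = [e for e in events
              if lo <= datetime.fromisoformat(e["timestamp"]) <= hi]
    return sorted(window, key=lambda e: e["timestamp"])
```

If the ordered window cannot answer "who did what, with which model version, and with what outcome", the event schema needs revisiting before an authority asks the same question.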


Related annexes

  • Annex III — High-risk AI system areas
  • Annex IV — Technical documentation

Frequently asked questions

Do logs need to store raw inputs containing personal data?

Article 12 mandates capability and minimum fields where applicable; GDPR minimisation, purpose limitation, and security still govern whether raw versus derived logging is lawful—coordinate with counsel.

Are cloud provider logs enough?

Infrastructure logs rarely satisfy Article 12 alone. You usually need application-level events tied to model version, policy decisions, and human verification steps where the article requires them.

How long must automatically generated logs be retained under Article 12?

Article 12(1) itself addresses capability, not retention: high-risk AI systems must technically allow the automatic recording of events (logs) over their lifetime. Retention periods sit elsewhere in the Act: providers must keep the logs under their control for a period appropriate to the intended purpose and for at least six months (Article 19(1)), and deployers must keep logs under their control for at least six months (Article 26(6)), in each case unless applicable Union or national law, in particular on personal data protection, provides otherwise; confirm the exact wording on EUR-Lex.