Chapter IX, Section 4 — Remedies

Article 85: Right to Explanation of Individual Decision-Making

Applies from 2 August 2026 · EUR-Lex verified April 2026

Article 85 is one of the most important individual rights provisions in the AI Act. It grants any affected person the right to obtain clear and meaningful explanations of the role of a high-risk AI system (one listed in Annex III, with the exception of point 2 thereof) in a decision-making procedure, and of the main elements of the decision taken. The right applies when the decision produces legal effects or similarly significantly affects the person in ways relating to their health, safety, or fundamental rights. The deployer must provide the explanation within a reasonable time. The right does not apply where existing Union law already provides for such a right (e.g., GDPR Article 22).

Who does this apply to?

  • Affected persons — individuals subject to decisions made with the assistance of a high-risk AI system
  • Deployers of high-risk AI systems who must provide explanations on request
  • Data protection officers coordinating between AI Act Article 85 and GDPR Article 22 obligations
  • Compliance teams designing explanation workflows for high-risk AI deployments

Scenarios

A bank uses a high-risk AI system to reject a consumer's credit application. The applicant asks why they were rejected.

The deployer (bank) must provide the applicant with a clear explanation of the AI system's role in the decision, the main elements of the rejection decision, and the key input data and parameters that influenced the outcome — within a reasonable time.
Ref. Art. 85(1)

A public employment service uses an AI-driven profiling tool to assign jobseekers to support categories. A jobseeker placed in the lowest priority tier requests an explanation.

The deployer must explain how the AI system contributed to the classification, the main factors considered, and the main parameters. If national social law already grants a right to explanation for this decision, Article 85 defers to that existing right.
Ref. Art. 85(1)–(2)

What Article 85 does (in plain terms)

Article 85 creates a standalone right to explanation for individuals affected by decisions made with the assistance of a high-risk AI system. The right is triggered when a decision:

1. Produces legal effects concerning the person (e.g., denial of a benefit, visa rejection), or
2. Similarly significantly affects the person in relation to their health, safety, or fundamental rights

The explanation must cover three elements:

  • (a) The role of the AI system in the decision-making procedure
  • (b) The main elements of the decision taken
  • (c) The main parameters and input data that influenced the decision

The deployer — not the provider — bears the duty to furnish the explanation. It must be delivered in clear and meaningful language, within a reasonable time.

Key carve-out: Article 85 does not apply where Union or Member State law already provides for a right to obtain an explanation of the specific decision. This prevents overlap with, for example, GDPR Article 22 automated decision-making rights or sector-specific procedural rights in administrative law.

How Article 85 connects to the rest of the Act

  • Article 26 — Deployer obligations: Article 85 reinforces the deployer's duty to operate high-risk systems transparently and to inform affected persons.
  • Article 13 — Transparency requirements: the technical documentation and instructions for use that providers must supply underpin the deployer's ability to explain.
  • Article 14 — Human oversight: meaningful explanation presupposes that human reviewers understand how the AI system contributed to a decision.
  • Article 86 — Right to lodge a complaint: if the deployer fails to provide an adequate explanation, the affected person may escalate via a complaint to the market surveillance authority.
  • Article 99 — Penalties: breach of deployer duties (including explanation obligations) can trigger Tier 2 fines (up to EUR 15M or 3% of turnover).
  • Article 113 — Application dates: Article 85 applies from 2 August 2026.
  • GDPR Article 22 — Where automated decision-making involves personal data, the GDPR right may apply in parallel or instead; Article 85 defers where Union or national law already provides an explanation right.

Practical guidance: building an explanation workflow

Article 85 requires deployers to be operationally ready to respond to explanation requests. Consider these implementation steps:

1. Identify triggering decisions — Map every use of a high-risk AI system where the output feeds into a decision producing legal effects or similarly significant impacts.
2. Pre-compute explanations — Where possible, generate explanation-ready outputs at decision time (e.g., feature-importance summaries, decision factors) so responses can be delivered quickly.
3. Define 'reasonable time' — While the Act does not prescribe a fixed deadline, align with analogous timelines (e.g., GDPR's one-month response period) and document your internal SLA.
4. Train frontline staff — The people who receive explanation requests (customer service, case officers) must know how to access and communicate AI-related decision factors.
5. Check the carve-out — For each use case, determine whether Union or national law already grants an explanation right. If so, comply with that existing regime and document why Article 85 is disapplied.
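Steps 2 and 3 above can be sketched in code: capture an explanation-ready record at decision time covering the three required elements, with an internal SLA deadline for responses. This is a minimal illustration, not a prescribed format — the field names, the credit-scoring example values, and the 30-day SLA (benchmarked against GDPR's one-month period) are all assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Hypothetical internal SLA, benchmarked against GDPR's one-month response period.
RESPONSE_SLA = timedelta(days=30)

@dataclass
class ExplanationRecord:
    """Explanation-ready output captured at decision time (step 2 above)."""
    decision_id: str
    ai_system_role: str                 # (a) role of the AI system in the procedure
    decision_elements: list             # (b) main elements of the decision taken
    main_parameters: dict               # (c) parameters/input data that influenced it
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def response_deadline(self, requested_at: datetime) -> datetime:
        """Deadline for answering an explanation request under the internal SLA."""
        return requested_at + RESPONSE_SLA

# Illustrative credit-rejection scenario (values invented for the sketch).
record = ExplanationRecord(
    decision_id="credit-2026-0142",
    ai_system_role="Risk score was one input to a human credit officer's decision",
    decision_elements=["application rejected", "score below approval threshold"],
    main_parameters={"debt_to_income_ratio": 0.42, "missed_payments_24m": 3},
)
deadline = record.response_deadline(datetime(2026, 9, 1))
```

Storing the record at decision time means frontline staff (step 4) only need to retrieve and communicate it, rather than reconstruct the decision factors after the fact.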

Official wording: Article 86 (English)

Note: In the published OJ L 2024/1689, the right to explanation is Article 86 (not 85). This guide file uses the slug `article-85` but the verbatim text below is Article 86 of Regulation (EU) 2024/1689.
1. Any affected person subject to a decision which is taken by the deployer on the basis of the output from a high-risk AI system listed in Annex III, with the exception of systems listed under point 2 thereof, and which produces legal effects or similarly significantly affects that person in a way that they consider to have an adverse impact on their health, safety or fundamental rights shall have the right to obtain from the deployer clear and meaningful explanations of the role of the AI system in the decision-making procedure and the main elements of the decision taken.
2. Paragraph 1 shall not apply to the use of AI systems for which exceptions from, or restrictions to, the obligation under that paragraph follow from Union or national law in compliance with Union law.
3. This Article shall apply only to the extent that the right referred to in paragraph 1 is not otherwise provided for under Union law.

Recitals and legislative context

Recitals 171–173 of the AI Act contextualise Article 85. They emphasise that affected persons should be able to understand the AI-assisted decision, and that the explanation should be provided in a way that is sufficiently clear for a reasonable person to understand. The recitals also clarify the relationship with GDPR Article 22 and stress that Article 85 is without prejudice to other Union or national law granting stronger explanation rights.

Use the official preamble on EUR-Lex to read the recitals in full.

Compliance checklist

  • Inventory all high-risk AI system deployments where output feeds into decisions with legal or similarly significant effects on individuals.
  • Design explanation templates covering the three required elements: (a) role of the AI system, (b) main elements of the decision, (c) main parameters and input data.
  • Establish an internal SLA for responding to explanation requests within a 'reasonable time' (consider benchmarking against GDPR's one-month standard).
  • Train customer-facing and case-handling staff on how to fulfil Article 85 explanation requests.
  • For each use case, assess whether GDPR Article 22 or other Union/national law already provides an explanation right — document the analysis and disapply Article 85 where appropriate.
  • Ensure providers supply sufficient technical documentation under Article 13 to enable deployers to generate meaningful explanations.
  • Log all explanation requests and responses for audit purposes and potential market surveillance authority inquiries.
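The last checklist item — logging requests and responses for audit — can be sketched as a simple append-only log. The entry fields below are illustrative assumptions, not prescribed by the Act; the point is to record when each request arrived, whether the carve-out applied, and whether a response was sent.

```python
from datetime import datetime, timezone

def log_explanation_request(log: list, decision_id: str, requester: str,
                            carve_out_applies: bool, response_sent: bool) -> dict:
    """Append one audit entry for an explanation request (fields are illustrative)."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "decision_id": decision_id,
        "requester": requester,
        # True when Union/national law (e.g. GDPR Art. 22) already covers the decision,
        # so Article 85 is disapplied — keep the analysis documented alongside.
        "carve_out_applies": carve_out_applies,
        "response_sent": response_sent,
    }
    log.append(entry)
    return entry

audit_log = []
log_explanation_request(audit_log, "credit-2026-0142", "applicant",
                        carve_out_applies=False, response_sent=True)
```

An append-only structure like this gives a market surveillance authority a clear trail showing that each request was received, assessed against the carve-out, and answered.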


Related annexes

  • annex-iii-high-risk-areas

Frequently asked questions

How does Article 85 differ from GDPR Article 22 (automated decision-making)?

Article 85 specifically targets high-risk AI system decisions and requires explanation of the AI system's role, the main decision elements, and key parameters. GDPR Article 22 addresses solely automated decisions involving personal data. Where GDPR Article 22 already provides an explanation right for the same decision, Article 85 defers to it — but Article 85 is broader in that it covers AI-assisted (not only fully automated) decisions and extends to health, safety, and fundamental rights impacts beyond personal data.

Who must provide the explanation — the provider or the deployer?

The deployer (the organisation that uses the AI system to make or support decisions). The provider must supply sufficient technical documentation and transparency information under Articles 13 and 26 to enable the deployer to fulfil this duty.

What counts as a 'reasonable time' for providing the explanation?

The AI Act does not define a specific deadline. In practice, align with comparable frameworks — GDPR uses one month as a benchmark. The key test is whether the delay is justified given the complexity of the explanation and whether the affected person can still meaningfully use the information.