Article 30: Requirements Relating to Notified Bodies
Article 30 lists the exhaustive requirements a conformity assessment body must satisfy to be designated as a notified body under the AI Act. The requirements span legal personality, independence, technical competence (including AI-specific expertise in data, computing, fundamental rights, and health and safety), documented procedures, liability insurance, and the ability to fulfil tasks within defined timeframes. Staff must maintain confidentiality. The notified body must also participate in standardisation activities and relevant EU coordination groups. Always verify against the text on EUR-Lex.
Who does this apply to?
- Conformity assessment bodies seeking notified body status under the AI Act
- Notified bodies maintaining ongoing compliance with Article 30 requirements
- Notifying authorities verifying that bodies meet and continue to meet Article 30 requirements
Scenarios
A conformity assessment body applies for notification but its AI technical team consists of only two engineers with software testing experience and no expertise in machine learning, data governance, or fundamental rights impact.
An established notified body under the Machinery Directive is also seeking notification under the AI Act for AI safety components in industrial robots.
What Article 30 does (plain terms)
Article 30 sets the bar for notified bodies. It is the most detailed article in Chapter IV because it defines every quality, competence, and organisational criterion a body must meet. The requirements include:
(a) Legal personality — The body must be established under the national law of a Member State and possess legal personality.
(b) Independence — The body must be organisationally and functionally independent from the providers whose AI systems it assesses. Senior management and assessment staff must not be involved in the design, development, manufacture, marketing, installation, use, or maintenance of the AI systems they assess. This extends to any parent company, subsidiary, or affiliated entity.
(c) Competence — The body must demonstrate the ability to carry out conformity assessment activities competently, with AI-specific expertise appropriate to the tasks. This includes knowledge of AI technologies, data and data computing, fundamental rights, health and safety risks, and the existing standards and regulatory framework.
(d) Staffing — Sufficient personnel with the requisite technical knowledge and experience, including understanding of:
- AI technologies and how they work
- Data and data governance
- Computing infrastructure
- Fundamental rights risks
- Health and safety risks
- Existing harmonised standards and common specifications
(e) Documented procedures — Transparent, reproducible assessment procedures that enable the body to carry out third-party conformity assessment appropriately.
(f) Liability insurance — Appropriate professional indemnity insurance for conformity assessment activities.
(g) Timeliness — The ability to fulfil tasks and provide results within defined timeframes.
Staff must maintain confidentiality regarding information obtained during assessment. The notified body must also participate in relevant standardisation activities and coordination groups. Verify every point against the text of Article 30 on EUR-Lex.
Independence and conflicts of interest
The independence requirement in Article 30 is designed to prevent any situation where a notified body's commercial interests or relationships could compromise its assessment objectivity. Key points:
- The body's top-level management and assessment staff must be free from commercial, financial, or other pressures that could influence their judgement.
- The body must not have been involved in the design, development, or marketing of the AI system being assessed, either directly or through related entities.
- The body must ensure that the activities of its subsidiaries or subcontractors (Article 32) do not impair its independence.
- Remuneration of assessment staff must not depend on the number of assessments performed or their results.
AI-specific competence requirements
Unlike notified-body requirements under existing New Legislative Framework legislation, Article 30 explicitly demands AI-domain expertise. This reflects the distinctive nature of AI systems, for which traditional product testing (e.g., mechanical stress tests) is insufficient.
Notified bodies must demonstrate competence in:
- AI technologies — understanding of machine learning paradigms, neural networks, symbolic AI, and hybrid systems.
- Data governance — ability to evaluate training data quality, bias, representativeness, and privacy compliance.
- Fundamental rights — knowledge of how AI systems can impact non-discrimination, privacy, freedom of expression, and human dignity.
- Cybersecurity — understanding of adversarial attacks, model robustness, and data poisoning risks.
This means existing notified bodies under Annex I product legislation (e.g., MDR, Machinery Regulation) will typically need to augment their teams with AI specialists.
How Article 30 connects to the rest of the Act
- Article 28 — The notifying authority verifies compliance with Article 30.
- Article 29 — The application process for bodies claiming to meet Article 30 requirements.
- Article 31 — Harmonised standards that trigger a presumption of conformity with Article 30.
- Article 32 — Subsidiaries and subcontractors must also meet Article 30 requirements.
- Article 43 — Conformity assessment procedures that notified bodies carry out.
- Annex VI — Internal control conformity assessment procedure.
- Annex VII — Conformity assessment procedure based on assessment of the QMS and technical documentation.
- Article 113 — Application dates and staged entry into force.
Compliance checklist
- Verify legal personality: confirm your conformity assessment body is established under national law with legal personality.
- Audit independence: ensure no organisational, financial, or personnel links to providers whose systems you will assess.
- Assess AI-specific staffing: verify that your team includes experts in AI technologies, data governance, fundamental rights, and cybersecurity.
- Document procedures: ensure all conformity assessment procedures are documented, transparent, and reproducible.
- Obtain liability insurance: confirm appropriate professional indemnity insurance covers AI conformity assessment activities.
- Demonstrate timeliness: establish internal SLAs and capacity planning to fulfil assessments within defined timeframes.
- Participate in standardisation: join relevant CEN/CENELEC working groups and EU coordination groups for AI Act notified bodies.
Related annexes
- Annex VI — Internal control conformity assessment procedure
- Annex VII — Conformity assessment based on QMS and technical documentation
Frequently asked questions
Can existing notified bodies under the MDR or Machinery Regulation automatically become notified bodies under the AI Act?
No. Existing designation under other EU legislation does not automatically extend to the AI Act. The body must separately apply under Article 29 and demonstrate that it meets the AI-specific requirements of Article 30, including AI technology expertise, data governance competence, and fundamental rights knowledge.
What happens if a notified body no longer meets Article 30 requirements after designation?
The notifying authority must restrict, suspend, or withdraw the notification under Article 34. Affected providers must be informed, and their certificates may need to be transferred to another notified body.
Is fundamental rights expertise really required for a technical conformity assessment body?
Yes. Article 30 explicitly includes knowledge of fundamental rights among the required competences. This reflects the AI Act's recognition that high-risk AI systems can impact non-discrimination, privacy, and other rights — the conformity assessment must cover these dimensions.