Article 108: Amendment to Regulation (EU) 2018/1139
Article 108 amends Regulation (EU) 2018/1139 — the EASA Basic Regulation on common rules in the field of civil aviation. AI systems in aviation — including autonomous flight systems, air traffic management (ATM) AI, predictive maintenance, pilot assistance, unmanned aircraft systems (UAS/drones), and design organisation AI tools — must comply with the AI Act alongside EASA aviation safety rules. EASA may take AI Act compliance into account during type certification, operational approval, and continuing airworthiness oversight. Always verify on EUR-Lex.
Who does this apply to?
- Aviation manufacturers (aircraft, engines, propellers, parts) and design organisations using AI in certified products or design tools
- EASA conducting type certification, supplemental type certification, and continuing airworthiness oversight
- Air traffic management and air navigation service providers deploying AI-based systems (ATM/ANS)
- UAS/drone manufacturers and operators with AI-powered autonomous flight, sense-and-avoid, or mission planning systems
Scenarios
An aircraft manufacturer develops an AI-powered flight envelope protection system that uses neural networks to predict aerodynamic limits in real time based on sensor fusion, adapting protection thresholds beyond what traditional static tables provide.
An air navigation service provider (ANSP) deploys an AI-based arrival management system that uses machine learning to optimise aircraft sequencing, spacing, and approach routing at a major European hub airport, reducing delays and fuel burn.
What Article 108 does (in plain terms)
Article 108 integrates the AI Act into Regulation (EU) 2018/1139 — commonly known as the EASA Basic Regulation — which is the cornerstone of EU civil aviation safety law. This Regulation establishes the European Union Aviation Safety Agency (EASA), sets out essential requirements for airworthiness, air operations, aircrew, air traffic management (ATM), and aerodromes, and defines the certification and oversight framework for all civil aviation in the EU.
The amendment means that where AI systems are used in any aspect of aviation covered by the EASA Regulation — from aircraft design and certification to air traffic management, predictive maintenance, UAS/drone operations, and ground handling — the AI Act requirements must be considered as part of EASA's regulatory processes. This follows the Article 8 integration model.
This is one of the most consequential amendments because aviation has some of the most stringent safety requirements of any industry, and AI is being adopted rapidly across the sector. The amendment ensures that EASA's established safety framework — including its risk-based approach, design assurance levels, and continuing airworthiness requirements — is enhanced to address AI-specific risks.
How Article 108 connects to the rest of the Act
- Article 8 — Integration with sector legislation: AI Act requirements are assessed through existing aviation certification and approval processes. Article 108 brings Regulation 2018/1139 into this framework.
- Annex I — Union harmonisation legislation: the EASA Regulation is listed, making aviation AI systems subject to the integrated conformity assessment regime.
- Article 9 — Risk management: aviation AI risk management must meet the highest standards, consistent with the sector's safety culture and existing risk management frameworks (e.g., EASA's management system requirements).
- Article 14 — Human oversight: pilots and air traffic controllers must retain meaningful decision authority over AI-assisted operations — consistent with established crew resource management (CRM) and controller responsibility principles.
- Article 102 — Aviation security amendment: Article 102 covers aviation security equipment under Regulation 300/2008; Article 108 covers aviation safety under the EASA framework. Together they address the full aviation regulatory landscape.
- Article 113 — Entry into force and application dates: confirms the timeline.
Practical guidance for aviation stakeholders
For aircraft and component manufacturers (Design Organisations):
- Map AI systems across your product portfolio: autopilot enhancements, flight envelope protection, engine health monitoring, AI-assisted diagnostics, and any ML-based design or simulation tools used in the certification process.
- Prepare AI Act compliance evidence as part of your type certificate (TC) or supplemental type certificate (STC) applications. EASA will expect to see AI-specific documentation alongside existing certification data packages.
- Monitor EASA's guidance development — EASA has been actively working on AI/ML certification frameworks (e.g., the EASA AI Roadmap and Concept Paper on AI). These will inform how Article 108 is implemented in practice.

For airlines and operators:
- Assess AI tools used in operations: dispatch systems, predictive maintenance platforms, pilot tablet applications, and fuel optimisation algorithms. Even if these are not part of the aircraft's type certificate, they may fall under operational approvals or management system requirements.
- Ensure that flight crew are trained on AI system capabilities and limitations — this supports both the AI Act's human oversight requirements and existing CRM/training obligations.

For ATM/ANS providers:
- Identify AI components in ATM systems: arrival/departure management, conflict detection and resolution, trajectory prediction, and workload management tools.
- Implement AI Act requirements within the existing ATM/ANS management system and change management processes, coordinated with EASA and national supervisory authorities.

For UAS/drone operators and manufacturers:
- AI is fundamental to most UAS operations (autonomous flight, sense-and-avoid, payload management). Ensure AI Act compliance is integrated into EASA UAS certification (for the certified category) or operational authorisation processes.
Compliance checklist
- Identify all AI systems in aviation products and operations covered by Regulation (EU) 2018/1139 — aircraft, engines, ATM/ANS, UAS, aerodromes, and maintenance.
- Classify each AI system under the AI Act risk framework — aviation AI safety components will generally qualify as high-risk under Annex I.
- Integrate AI Act documentation (risk management, data governance, technical documentation, human oversight) into EASA certification data packages (TC, STC, ETSO).
- Ensure risk management for aviation AI (Article 9) is consistent with EASA's existing safety management and risk assessment frameworks.
- Design human oversight so that pilots and controllers retain meaningful authority — consistent with CRM principles and ICAO/EASA requirements for flight crew and controller responsibilities.
- Monitor EASA's AI/ML certification guidance and roadmap for implementation specifics and acceptable means of compliance.
- Verify application dates under [**Article 113**](/en/ai-act-guide/article-113) and the consolidated text on [**EUR-Lex**](https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=OJ:L_202401689#article-108).
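As an illustration only, the first two checklist steps — building an inventory of aviation AI systems and screening each for likely high-risk status — could be tracked in a simple structure like the sketch below. The record fields, domain labels, and screening rule are hypothetical working assumptions for illustration, not definitions from the AI Act or EASA guidance, and no such screening substitutes for a legal classification.

```python
from dataclasses import dataclass

@dataclass
class AviationAISystem:
    """Hypothetical inventory record for an aviation AI system.

    Fields are illustrative only: 'domain' might be 'aircraft',
    'ATM/ANS', 'UAS', 'maintenance', or 'operations'; the booleans
    capture rough proxies for Annex I relevance.
    """
    name: str
    domain: str
    safety_component: bool   # part of a product covered by Reg. (EU) 2018/1139?
    third_party_assessed: bool  # subject to third-party conformity assessment?


def likely_high_risk(system: AviationAISystem) -> bool:
    # Rough screening heuristic: AI safety components of EASA-regulated
    # products that undergo third-party conformity assessment are the
    # typical Annex I high-risk case. Always verify against the Act
    # itself and EASA guidance before relying on this.
    return system.safety_component and system.third_party_assessed


inventory = [
    AviationAISystem("flight envelope protection", "aircraft", True, True),
    AviationAISystem("crew rostering assistant", "operations", False, False),
]

flagged = [s.name for s in inventory if likely_high_risk(s)]
print(flagged)  # → ['flight envelope protection']
```

A real inventory would carry far more detail per system (certification basis, data governance status, oversight design), but even a minimal screening pass like this helps scope which systems need the full AI Act documentation package in the certification data.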
Frequently asked questions
Does Article 108 mean EASA becomes the AI Act enforcement body for aviation?
Article 108 amends the EASA Regulation so that EASA's certification and oversight processes account for AI Act requirements. EASA integrates AI Act compliance into its existing certification framework (type certificates, operational approvals, management systems). However, national market surveillance authorities retain their general AI Act enforcement role. In practice, EASA's certification assessment will be the primary mechanism for ensuring aviation AI meets AI Act standards — similar to how EASA already ensures compliance with other EU requirements through its certification process.
How does the AI Act interact with EASA's existing AI/ML work (AI Roadmap, Concept Papers)?
EASA has been developing AI/ML-specific certification guidance since 2020 (AI Roadmap 1.0 and 2.0, Concept Papers on AI trustworthiness). Article 108 gives EASA a legal basis to require AI Act compliance as part of aviation certification. EASA's existing AI/ML concept papers are expected to evolve into acceptable means of compliance (AMC) or certification specifications (CS) that operationalise the AI Act requirements in the aviation context.
Does this apply to military aviation?
No. Regulation 2018/1139 (EASA Basic Regulation) applies to civil aviation. Military aviation is excluded from both the EASA framework and the AI Act (Article 2(3) excludes AI systems developed or used exclusively for military purposes). Dual-use systems may be subject to the AI Act for their civil aviation use.