Chapter XIII — Final Provisions

Article 102: Amendment to Regulation (EC) No 300/2008

Applies from 2 Aug 2026 · EUR-Lex verified Apr 2026

Article 102 amends Regulation (EC) No 300/2008 (common rules in the field of civil aviation security) to insert a reference to the EU AI Act. Where AI systems are used in aviation security equipment — such as AI-enhanced baggage scanners, automated threat-detection systems, or biometric passenger screening — the AI Act requirements apply in addition to the existing aviation security rules. The amendment ensures that conformity assessment under the aviation security regime takes AI Act obligations into account, preventing a regulatory gap for AI-powered screening technologies. Always verify on EUR-Lex.

Who does this apply to?

  • Aviation security equipment manufacturers that embed AI in screening, detection, or monitoring systems
  • National civil aviation security authorities overseeing equipment approval and deployment
  • Airport operators deploying AI-based security screening equipment (baggage scanners, access control, biometric gates)

Scenarios

An equipment manufacturer develops an AI-enhanced cabin baggage scanner that uses deep learning to detect prohibited items. The scanner is CE-marked under the aviation security equipment regime and is deployed at a major EU airport.

Under Article 102, the manufacturer must ensure that the AI component also complies with the AI Act — including high-risk classification under Annex I (if it falls under safety-component product legislation), conformity assessment, risk management (Article 9), data governance (Article 10), and technical documentation (Article 11). The existing aviation security approval does not exempt the AI layer from AI Act scrutiny.
Ref. Art. 102, Art. 8, Annex I

An airport deploys an AI-powered biometric boarding system that uses facial recognition to verify passenger identity at security checkpoints, replacing manual document checks.

The system falls under both the aviation security regime (Regulation 300/2008) and the AI Act. As a biometric identification system used in a security context, it may be classified as high-risk under Annex III. The airport (as deployer) must comply with deployer obligations, while the system provider must ensure conformity with both regulatory frameworks simultaneously.
Ref. Art. 102, Art. 26, Annex III

What Article 102 does (in plain terms)

Article 102 is one of a series of amendment articles in Chapter XIII that integrate the AI Act into existing EU sector legislation. Its specific target is Regulation (EC) No 300/2008, which establishes common rules on civil aviation security — covering airport security, in-flight security, cargo screening, and the equipment used for these purposes.

The amendment inserts a reference to Regulation (EU) 2024/1689 (the AI Act) so that where AI systems are embedded in aviation security equipment or processes, conformity assessment and ongoing compliance must account for both regimes. This follows the AI Act's general approach under Article 8: when an AI system is a safety component of a product covered by Union harmonisation legislation listed in Annex I, the AI Act obligations are folded into the product's existing regulatory framework rather than creating a parallel, standalone compliance track.

In practice, this means that aviation security equipment manufacturers cannot argue that aviation-specific certification alone satisfies AI requirements — the AI Act's risk management, data governance, transparency, and human oversight obligations apply on top of the existing aviation security baseline.

How Article 102 connects to the rest of the Act

  • Article 8 (Compliance of high-risk AI systems with existing sector legislation): the master provision that explains how AI Act obligations integrate with product-level Union harmonisation legislation. Article 102 is the specific amendment that brings Regulation 300/2008 into this framework.
  • Annex I (Union harmonisation legislation, Section A): lists the sector legislation whose conformity assessment procedures apply to high-risk AI systems. After the Article 102 amendment, aviation security equipment containing AI falls under this integrated regime.
  • Article 9 (Risk management system): AI-powered screening equipment must implement continuous risk management covering the specific risks of AI misclassification (e.g., false negatives on threat detection).
  • Article 43 (Conformity assessment): determines which conformity assessment route applies; for Annex I products, the existing sectoral assessment procedure applies with AI Act requirements integrated.
  • Article 113 (Entry into force and application dates): confirms the staged timeline, including when Article 102 becomes operationally relevant.

Practical guidance for aviation security stakeholders

For equipment manufacturers:

  • Audit every AI component in your aviation security product portfolio. If a screening device, detection algorithm, or monitoring system uses AI, it must be assessed against AI Act requirements.
  • Update technical documentation to include AI-specific elements: training data descriptions, performance metrics, bias analysis, and human oversight mechanisms.
  • Coordinate with your notified body — they will need to assess AI Act compliance as part of the aviation security equipment certification process.

For airport operators (deployers):

  • Review procurement specifications for AI-based security equipment to require AI Act compliance evidence from suppliers.
  • Implement deployer obligations: ensure human oversight at security checkpoints where AI-assisted decisions are made, maintain usage logs, and monitor system performance post-deployment.
  • Train security personnel on the AI system's capabilities and limitations — they must understand when to override automated decisions.
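To make the deployer-side logging and override duties concrete, here is a minimal sketch of an audit record for AI-assisted screening decisions. All names (`ScreeningDecision`, `log_decision`, the field layout) are hypothetical illustrations: the AI Act requires logging and human oversight but does not prescribe any particular schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

# Hypothetical audit record for one AI-assisted screening decision.
# Field names are illustrative; the AI Act mandates record-keeping and
# human oversight for deployers but leaves the format to the operator.
@dataclass
class ScreeningDecision:
    checkpoint_id: str
    system_version: str
    ai_verdict: str          # e.g. "clear" or "flag"
    ai_confidence: float     # model score in [0.0, 1.0]
    operator_id: str
    operator_override: bool  # True if the human overrode the AI verdict
    final_verdict: str
    timestamp: str = ""

    def __post_init__(self):
        # Auto-fill a UTC timestamp if the caller did not supply one.
        if not self.timestamp:
            self.timestamp = datetime.now(timezone.utc).isoformat()

def log_decision(decision: ScreeningDecision, sink) -> None:
    """Append one decision as a JSON line to a writable sink."""
    sink.write(json.dumps(asdict(decision)) + "\n")
```

Capturing the operator's override as an explicit field is what lets a post-deployment audit show that human oversight was real rather than nominal.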

For national aviation authorities:

  • Update inspection and audit frameworks to include AI Act compliance checks alongside existing aviation security assessments.
  • Coordinate with national AI market surveillance authorities to avoid duplicative or conflicting enforcement actions.

Compliance checklist

  • Identify all AI systems embedded in aviation security equipment covered by Regulation (EC) No 300/2008.
  • Classify each AI system under the AI Act risk framework — many will qualify as high-risk under Annex I (safety component of regulated product) or Annex III (biometric identification).
  • Update conformity assessment documentation to address AI Act requirements (risk management, data governance, technical documentation, human oversight) alongside aviation security requirements.
  • Ensure the notified body conducting aviation security equipment certification is prepared to assess AI Act compliance elements.
  • Implement deployer-side obligations: human oversight protocols at screening points, post-market monitoring, and performance logging.
  • Align implementation timelines with [**Article 113**](/en/ai-act-guide/article-113) staged dates and verify on [**EUR-Lex**](https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=OJ:L_202401689#article-102).
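The classification step in the checklist can be sketched as a simple triage helper. This is a deliberately simplified assumption about the decision order (Annex I safety-component route first, then Annex III), not the Act's full classification rules — always apply Article 6, Annex I, and Annex III as published on EUR-Lex.

```python
# Illustrative triage for the checklist above. The return strings and
# the decision order are simplified assumptions, not legal advice.
def classify(uses_ai: bool,
             safety_component_annex_i: bool,
             biometric_identification: bool) -> str:
    if not uses_ai:
        return "out of scope"
    if safety_component_annex_i:
        # Annex I route: sectoral conformity assessment per Art. 43.
        return "high-risk (Annex I route, sectoral assessment)"
    if biometric_identification:
        # Annex III route: standalone high-risk classification.
        return "high-risk (Annex III)"
    return "review: possibly limited-risk; document the rationale"
```

For an AI-enhanced baggage scanner, `classify(True, True, False)` lands in the Annex I route; a standalone biometric gate with `classify(True, False, True)` lands in Annex III, matching the two scenarios earlier in this article.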


Frequently asked questions

Does Article 102 create entirely new requirements for aviation security equipment?

No. Article 102 does not create standalone new rules — it amends Regulation (EC) No 300/2008 to cross-reference the AI Act. The substantive AI-related obligations (risk management, data governance, transparency, human oversight) come from the AI Act itself. Article 102 ensures those obligations are integrated into the aviation security regulatory framework rather than running in parallel.

If my aviation security equipment was already certified before the AI Act applies, do I need to re-certify?

Existing certifications are not automatically invalidated, but when equipment undergoes significant modification — or when new AI-powered equipment enters the market after the application date — AI Act compliance must be demonstrated. Check the transitional provisions under Article 113 and Article 111 for your specific situation.

Which body enforces AI Act compliance for aviation security equipment?

National market surveillance authorities enforce AI Act compliance for high-risk AI systems at the application level. However, the aviation security equipment certification process (via notified bodies) should incorporate AI Act requirements. Coordination between aviation security authorities and AI market surveillance authorities is expected.