Chapter VI — Measures in Support of Innovation

Article 59: Further Processing of Personal Data for Developing Certain AI Systems in the Public Interest in the AI Regulatory Sandbox

Applies from 2 Aug 2026. EUR-Lex verified Apr 2026.

Article 59 creates a limited legal basis for the further processing of personal data within AI regulatory sandboxes, but only for developing AI systems that serve a substantial public interest — such as public safety, environmental protection, or public health. This is not a general GDPR exemption. Strict conditions apply: the data must be necessary and not replaceable by synthetic or anonymised data, adequate safeguards (pseudonymisation, encryption) must be in place, data must be deleted when sandbox participation ends, and processing logs must be maintained. Data protection authority approval is required.

Who does this apply to?

  • Sandbox participants who need to process personal data for developing AI systems in the public interest and cannot achieve the same results with synthetic or anonymised data
  • Data protection authorities (DPAs) responsible for approving and supervising personal data processing within the sandbox framework
  • Data protection officers (DPOs) overseeing lawful data use within sandboxes and ensuring GDPR alignment

Scenarios

A public health agency develops an AI system to detect early signs of pandemics from hospital admission records. Synthetic data cannot capture the statistical patterns needed. The agency applies to the sandbox with DPA approval, implements pseudonymisation and encryption, and agrees to delete all personal data upon sandbox exit.

Article 59 permits the further processing because the system serves a substantial public interest, synthetic alternatives are inadequate, safeguards are in place, and DPA approval was obtained.
Ref. Art. 59(1)

A private company developing a commercial recommendation engine seeks to use customers' browsing history in a sandbox, arguing it would improve user experience.

Article 59 does not apply — a commercial recommendation engine does not meet the public interest threshold. The company must rely on standard GDPR legal bases and, where possible, use synthetic or anonymised data.
Ref. Art. 59(1)

What Article 59 does (in plain terms)

Article 59 addresses a specific tension: AI systems often need large volumes of representative data to function well, but GDPR's purpose-limitation principle restricts reusing personal data for new purposes. Article 59 provides a narrow exception within sandboxes.

The exception is available only when all of the following conditions are met:

1. Public interest — the AI system is developed for a purpose of substantial public interest (e.g., public safety, environmental protection, public health, road safety).
2. Necessity — the personal data are necessary for the development and the same result cannot be achieved with synthetic, anonymised, or other non-personal data.
3. Safeguards — effective technical and organisational measures are in place, including at minimum pseudonymisation and encryption where feasible.
4. Data deletion — the personal data and any intermediate outputs are deleted when sandbox participation ends or the data reach the end of their retention period, whichever comes first.
5. Processing logs — the sandbox participant must keep logs of data processing activities for the duration of participation.
6. DPA approval — the relevant data protection authority must approve the processing before it begins.
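Article 59 does not prescribe particular techniques for the safeguards condition. As a minimal sketch of what pseudonymisation could look like in practice, the snippet below replaces a direct identifier with a keyed hash (HMAC-SHA-256). The field names, the key-handling, and the choice of keyed hashing are illustrative assumptions, not requirements of the Act:

```python
import hmac
import hashlib

# Illustrative only: the key would be generated securely and stored
# separately from the dataset, so pseudonyms cannot be re-linked to
# individuals without access to it.
SECRET_KEY = b"replace-with-a-securely-managed-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed, deterministic pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical hospital-admission record, as in the pandemic-detection scenario.
record = {"patient_id": "NHS-1234567", "admission_date": "2026-03-02"}
record["patient_id"] = pseudonymise(record["patient_id"])
```

Because the pseudonym is deterministic, records belonging to the same person can still be linked for analysis, which is what distinguishes pseudonymisation from anonymisation; note that pseudonymised data remain personal data under the GDPR.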

This is not a blanket override of GDPR. All other GDPR obligations (data subject rights, security measures, accountability) continue to apply in full.

The GDPR relationship

Article 59 operates without prejudice to the GDPR and the Law Enforcement Directive (EU) 2016/680. It provides a supplementary legal basis for further processing in a very specific context — it does not create a general "research exemption" or reduce GDPR compliance obligations.

Key interactions:

  • Purpose limitation (GDPR Art. 5(1)(b)) — Article 59 allows further processing for a different but defined purpose (developing public-interest AI in a sandbox), provided the conditions are met.
  • Data minimisation (GDPR Art. 5(1)(c)) — the requirement that synthetic or anonymised data must be used if possible reinforces minimisation.
  • Data subject rights — remain applicable throughout sandbox participation.
  • DPA oversight — the approval requirement gives DPAs a gatekeeping role, ensuring case-by-case assessment.

How Article 59 connects to the rest of the Act

  • Article 57 — establishes the sandbox framework within which Article 59 operates.
  • Article 58 — the agreed plan must incorporate Article 59 data-processing arrangements where invoked.
  • Article 60 — real-world testing outside sandboxes has its own consent and data rules; Article 59 is sandbox-specific.
  • Article 10 — data governance requirements for high-risk AI systems (Article 59 processing should align with Article 10 quality standards).
  • Article 113 — application timeline.

Compliance checklist

  • Confirm the AI system qualifies as serving a substantial public interest — commercial or purely private purposes do not meet the Article 59 threshold.
  • Document why synthetic, anonymised, or other non-personal data cannot achieve the same development outcome — this is a prerequisite, not an afterthought.
  • Obtain data protection authority approval before beginning any further processing of personal data in the sandbox.
  • Implement pseudonymisation, encryption, and any other technical and organisational safeguards appropriate to the data sensitivity.
  • Establish a data retention and deletion policy: personal data must be deleted when sandbox participation ends or the retention period expires.
  • Maintain processing logs throughout sandbox participation — these must be available to the DPA and the sandbox supervising authority on request.
  • Ensure GDPR data subject rights remain exercisable throughout the sandbox period — Article 59 does not suspend them.
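The Act requires processing logs but does not prescribe their format. As a minimal sketch of what a log entry could record to support DPA inspection, the structure below is an assumption (the field names and the JSON encoding are illustrative, not mandated):

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class ProcessingLogEntry:
    """Hypothetical minimal record of one processing activity in the sandbox."""
    timestamp: str    # when the activity occurred (UTC, ISO 8601)
    operation: str    # e.g. "train", "evaluate", "delete"
    dataset: str      # which dataset was touched
    purpose: str      # the public-interest purpose being pursued
    legal_basis: str = "AI Act Art. 59"

def log_entry(operation: str, dataset: str, purpose: str) -> str:
    """Serialise one log entry as a JSON line, suitable for an append-only log."""
    entry = ProcessingLogEntry(
        timestamp=datetime.now(timezone.utc).isoformat(),
        operation=operation,
        dataset=dataset,
        purpose=purpose,
    )
    return json.dumps(asdict(entry))
```

An append-only, timestamped format like this makes it straightforward to hand the full processing history to the DPA or the sandbox supervising authority on request.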


Frequently asked questions

Does Article 59 let me use any personal data I want in a sandbox?

No. Article 59 only applies to further processing of personal data for developing AI systems that serve a substantial public interest, and only when synthetic or anonymised data cannot achieve the same result. You must also obtain DPA approval and implement strict safeguards. All other GDPR obligations remain in force.

What qualifies as 'substantial public interest' under Article 59?

The article references areas such as public safety, environmental protection, and public health. The exact scope will depend on the DPA's case-by-case assessment. A purely commercial AI system would not qualify, even if it has incidental public benefits.

Do I need a DPIA in addition to DPA approval under Article 59?

Yes. Article 59 supplements but does not replace GDPR obligations. If the processing is likely to result in a high risk to individuals (which further processing of personal data in an AI sandbox typically does), a DPIA under GDPR Article 35 remains required.