GDPR Intersections
The GDPR and the EU AI Act apply simultaneously wherever AI systems process personal data, which most high-risk AI systems do. Understanding the overlaps, complementarities, and tensions between the two regulations is essential for any AI compliance programme.
The key principle: cumulative compliance
The EU AI Act Recital 9 confirms that the AI Act does not supersede or replace the GDPR. Both apply simultaneously: meeting AI Act obligations does not satisfy GDPR obligations, and vice versa. You need a compliance programme that addresses both, identifies overlapping requirements (where you can create efficiencies), and covers the gaps that each regulation leaves.
Key GDPR–AI Act intersections
| GDPR provision | AI Act provision | How they interact |
|---|---|---|
| Art. 5 — Principles | Art. 10 — Data governance | Lawful basis, purpose limitation, and data minimisation must apply to all training and inference data |
| Art. 13–14 — Transparency | Art. 13 — Instructions for use | AI systems must disclose their existence and logic to both deployers (AI Act) and data subjects (GDPR) |
| Art. 22 — Automated decisions | Art. 14 — Human oversight | Purely automated decisions with significant effect require human review under both instruments |
| Art. 35 — DPIA | Art. 9 — Risk management | High-risk AI systems processing personal data will almost always require a DPIA in addition to an AI Act risk assessment |
| Art. 25 — Data protection by design | Art. 9 + Art. 15 | Privacy by design aligns with AI Act requirements for robustness and security. Both require risk assessment in design phase |
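For compliance tooling, the mapping in the table above can be represented as plain data. A minimal sketch in Python (the dictionary name and structure are illustrative assumptions, not part of either regulation):

```python
# GDPR article -> (AI Act article, interaction) -- mirrors the table above
INTERSECTIONS = {
    "GDPR Art. 5": ("AI Act Art. 10", "principles apply to training and inference data"),
    "GDPR Art. 13-14": ("AI Act Art. 13", "transparency to deployers and data subjects"),
    "GDPR Art. 22": ("AI Act Art. 14", "human review of significant automated decisions"),
    "GDPR Art. 35": ("AI Act Art. 9", "DPIA alongside the AI Act risk assessment"),
    "GDPR Art. 25": ("AI Act Art. 9 + 15", "by-design risk assessment, robustness, security"),
}

for gdpr, (ai_act, note) in INTERSECTIONS.items():
    print(f"{gdpr} <-> {ai_act}: {note}")
```

A structure like this lets a checklist generator emit paired obligations per system rather than maintaining two separate registers.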
GDPR Art. 22
Automated Individual Decision-Making
Plain English
GDPR Article 22 is the most directly relevant GDPR provision for AI systems. It gives people the right not to be subject to purely automated decisions that significantly affect them — unless one of three exceptions applies (contract necessity, legal authorisation, or explicit consent). When an exception applies, you must still provide meaningful information about the logic involved, the significance, and the likely consequences. Data subjects also have the right to obtain human intervention, express their point of view, and contest the decision. This directly intersects with the EU AI Act's human oversight requirements in Article 14.
Official Text (EUR-Lex)
1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

2. Paragraph 1 shall not apply if the decision: (a) is necessary for entering into, or performance of, a contract between the data subject and a data controller; (b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or (c) is based on the data subject's explicit consent.
Key obligations
1. Identify all automated decision-making processes that produce legal or similarly significant effects
2. Determine which GDPR Art. 22 exception applies for each automated decision
3. Provide clear information about automated decision logic under GDPR Art. 13/14
4. Implement human review mechanisms — which also satisfy AI Act Art. 14 human oversight
5. Enable data subjects to contest automated decisions
6. Document your Art. 22 compliance position for each AI system
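The inventory and documentation steps above can be captured in a simple register. A minimal sketch in Python, assuming a hypothetical `AutomatedDecision` record (the field names are illustrative, not drawn from either regulation):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Art22Exception(Enum):
    """The three exceptions listed in GDPR Art. 22(2)."""
    CONTRACT_NECESSITY = "22(2)(a) contract necessity"
    LEGAL_AUTHORISATION = "22(2)(b) Union/Member State law"
    EXPLICIT_CONSENT = "22(2)(c) explicit consent"

@dataclass
class AutomatedDecision:
    """One entry in an Art. 22 decision inventory (illustrative fields)."""
    system_name: str
    produces_significant_effect: bool
    exception: Optional[Art22Exception] = None
    human_review_available: bool = False

    def compliant(self) -> bool:
        # Decisions without legal or similarly significant effect
        # fall outside Art. 22(1) entirely.
        if not self.produces_significant_effect:
            return True
        # Otherwise an exception must apply AND human intervention
        # must be available (the Art. 22(3) safeguard).
        return self.exception is not None and self.human_review_available

loan_scoring = AutomatedDecision(
    system_name="credit-scoring-v2",
    produces_significant_effect=True,
    exception=Art22Exception.CONTRACT_NECESSITY,
    human_review_available=True,
)
print(loan_scoring.compliant())  # True
```

A register like this is a documentation aid only; whether an effect is "similarly significant" remains a legal judgement per system.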
Source
Official text from EUR-Lex — Regulation (EU) 2016/679 (GDPR). This text is in the public domain.
GDPR Art. 35
Data Protection Impact Assessment (DPIA)
Plain English
Almost every high-risk AI system that processes personal data will require a DPIA under the GDPR. The DPIA must be conducted before deployment and must assess the necessity and proportionality of the processing, the risks to data subjects, and the measures to address those risks. DPIAs can and should be integrated with the AI Act's risk management system under Article 9 — the two documents overlap substantially. Your DPA (Data Protection Authority) may have published lists of processing activities that always require a DPIA — many of these will be AI-related.
Official Text (EUR-Lex)
1. Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.

3. A data protection impact assessment referred to in paragraph 1 shall in particular be required in the case of: (a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person; (b) processing on a large scale of special categories of data referred to in Article 9(1)...
Key obligations
1. Conduct a DPIA before deploying any high-risk AI system that processes personal data
2. Integrate the DPIA with your AI Act Art. 9 risk management documentation
3. Consult your Data Protection Officer (DPO) during the DPIA process
4. Consult the supervisory authority prior to processing where the DPIA shows high residual risk (Art. 36)
5. Review and update the DPIA when there are changes to the AI system or its context
6. Document the DPIA and retain it for DPA inspections
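The Art. 35 trigger analysis behind the first step above can be sketched as a screening function. The flag names below paraphrase the official text and the function is an illustrative aid, not a substitute for a legal assessment:

```python
def dpia_required(
    systematic_profiling_with_effects: bool,  # Art. 35(3)(a)
    large_scale_special_categories: bool,     # Art. 35(3)(b)
    likely_high_risk: bool,                   # general Art. 35(1) test
) -> bool:
    """Screen whether GDPR Art. 35 requires a DPIA.

    Any single criterion is sufficient: Art. 35(3) lists cases that
    always require a DPIA, while Art. 35(1) catches other processing
    likely to result in high risk to rights and freedoms.
    """
    return (
        systematic_profiling_with_effects
        or large_scale_special_categories
        or likely_high_risk
    )

# A high-risk AI system scoring individuals almost always trips 35(3)(a):
print(dpia_required(True, False, False))  # True
```

In practice you would extend the criteria with your supervisory authority's published blacklist of processing operations that always require a DPIA.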
Source
Official text from EUR-Lex — Regulation (EU) 2016/679 (GDPR). This text is in the public domain.
GDPR Art. 5 + AI Act Art. 10
Data Principles and AI Training Data Governance
Plain English
The GDPR's data processing principles apply to all personal data used to train, validate, and test AI systems. You need a lawful basis to use personal data for training — which in practice is difficult to establish for large-scale data scraping. Purpose limitation means you can't freely reuse personal data collected for one purpose for AI training. The AI Act's Art. 10 data governance requirements (bias examination, data quality checks, gap identification) go beyond GDPR but align with it. When building your data governance framework under Art. 10, design it to simultaneously satisfy GDPR Art. 5.
Official Text (EUR-Lex)
GDPR Art. 5 — Principles relating to processing of personal data: (a) processed lawfully, fairly and in a transparent manner ('lawfulness, fairness and transparency'); (b) collected for specified, explicit and legitimate purposes ('purpose limitation'); (c) adequate, relevant and limited to what is necessary ('data minimisation'); (d) accurate and, where necessary, kept up to date ('accuracy'); (e) kept in a form which permits identification no longer than necessary ('storage limitation'); (f) processed in a manner ensuring appropriate security ('integrity and confidentiality'). EU AI Act Art. 10 — Data governance: training, validation and testing data shall be subject to appropriate data governance and management practices.
Key obligations
1. Establish a lawful basis for each category of personal data used in AI training
2. Conduct a purpose limitation analysis — is AI training compatible with the original data collection purpose?
3. Apply data minimisation — train on the minimum necessary personal data
4. Implement data quality and accuracy checks on training datasets
5. Document your data governance approach to satisfy both GDPR Art. 5 and AI Act Art. 10
6. Consider synthetic data or anonymisation techniques to reduce GDPR exposure
7. Establish data retention schedules for training datasets
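The per-dataset documentation obligations above can be tracked in a single record per training dataset. A minimal sketch, where the record fields are illustrative assumptions while the lawful-basis values mirror GDPR Art. 6(1)(a)-(f):

```python
from dataclasses import dataclass, field

# The six lawful bases in GDPR Art. 6(1)
LAWFUL_BASES = {
    "consent", "contract", "legal_obligation",
    "vital_interests", "public_task", "legitimate_interests",
}

@dataclass
class TrainingDatasetRecord:
    """Documents one training dataset against GDPR Art. 5 / AI Act Art. 10."""
    name: str
    lawful_basis: str
    original_purpose: str
    training_purpose_compatible: bool  # outcome of the compatibility analysis
    minimised: bool                    # data minimisation applied
    retention_period_months: int
    quality_checks: list = field(default_factory=list)

    def __post_init__(self):
        if self.lawful_basis not in LAWFUL_BASES:
            raise ValueError(f"unknown lawful basis: {self.lawful_basis}")

record = TrainingDatasetRecord(
    name="support-tickets-2023",
    lawful_basis="legitimate_interests",
    original_purpose="customer support",
    training_purpose_compatible=True,
    minimised=True,
    retention_period_months=24,
    quality_checks=["bias examination", "gap identification"],
)
```

Validating the lawful basis at record creation keeps the register from silently accumulating entries with no Art. 6 analysis behind them.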
Source
Official text from EUR-Lex — Regulation (EU) 2016/679 (GDPR) and Regulation (EU) 2024/1689 (EU AI Act). These texts are in the public domain.
GDPR roles vs AI Act roles
- AI system provider (AI Act role): Often the GDPR data controller for training data. May be a processor for inference data, depending on the deployment model.
- AI system deployer (AI Act role): Often the GDPR data controller for the end users whose data the AI processes during operation. Responsible for Art. 22 compliance.
- Both / integrated (the most common scenario): Many organisations are both provider and deployer. GDPR obligations stack on top of AI Act obligations at each layer.