Two Regulatory Frameworks, One AI System

The EU AI Act and the GDPR are not alternatives. They are concurrent regulatory frameworks that apply simultaneously to the same AI systems wherever those systems process personal data.

For the vast majority of AI systems deployed in commercial contexts, which in practice handle information about individuals in one form or another, compliance with the AI Act does not satisfy GDPR obligations, and GDPR compliance does not substitute for AI Act requirements. Both frameworks must be addressed, and the compliance work for one can only be partially leveraged for the other.

Understanding where the two frameworks overlap, where they diverge, and where they create compounding requirements is essential for organisations building AI governance programmes. The interaction is not merely administrative. It affects system design decisions, organisational accountability structures, documentation requirements, and the procedural steps needed before a system can be deployed. In some cases, the same processing activity triggers both a Data Protection Impact Assessment (DPIA) under the GDPR and a Fundamental Rights Impact Assessment (FRIA) under the AI Act, and the two assessments have different scopes, methodologies, and documentation requirements.

Where the Frameworks Overlap

The most significant area of overlap concerns automated decision-making. The GDPR's Article 22 grants data subjects the right not to be subject to decisions based solely on automated processing that produce legal or similarly significant effects, and requires the controller to provide meaningful information about the logic involved, along with the significance and envisaged consequences of the processing. The AI Act's transparency requirements for high-risk AI systems operate in a parallel track: deployers must inform individuals when they are subject to a high-risk AI system's output and must provide meaningful information about the system's capabilities and limitations.

Both frameworks also require documented risk assessments before certain processing or deployments commence. The GDPR requires a DPIA where processing is likely to result in high risk to the rights and freedoms of individuals, and automated decision-making with significant effects almost always triggers this obligation. The AI Act requires a Fundamental Rights Impact Assessment for deployers of certain high-risk systems and requires providers to conduct a conformity assessment. These assessments cover overlapping but not identical ground, and producing one does not satisfy the other.

Key Divergences

While the frameworks overlap substantially, they have different scopes and different conceptual anchors. The GDPR is triggered by the processing of personal data and applies whenever information relating to identified or identifiable individuals is handled. The AI Act is triggered by the use of an AI system and applies regardless of whether personal data is involved, although in practice most AI systems of commercial interest do process personal data.

The AI Act's obligations fall on providers and deployers as defined by their role in the AI system's lifecycle, while the GDPR's obligations fall on controllers and processors defined by their role in the data processing relationship. These categories do not map neatly onto each other. A GDPR processor (a service provider processing data on behalf of a controller) may be an AI Act provider if it developed the AI system and places it on the market; the controller may be a deployer. Data Processing Agreements under the GDPR must be supplemented or co-drafted with contractual arrangements under the AI Act that address the division of responsibilities between provider and deployer.

Compounding Compliance Obligations

The compounding effect of both frameworks applying simultaneously is most visible in high-risk AI systems that process sensitive personal data: medical AI, HR and recruitment AI, and financial AI systems in credit or insurance contexts. These systems are likely to be high-risk under the AI Act (falling within Annex III categories), likely to involve GDPR-regulated personal data processing, and likely to involve special category data (such as health or biometric data) that attracts enhanced GDPR protections, including the need for explicit consent or another Article 9 condition, enhanced DPIA obligations, and potential restrictions on automated decision-making.

Organisations in these sectors cannot sequence their compliance work as if one framework precedes the other. The system design must satisfy both sets of requirements from the outset. Technical documentation, logging, and human oversight mechanisms required by the AI Act will often overlap with the security measures, data minimisation requirements, and purpose limitation obligations of the GDPR. Designing for both simultaneously is more efficient than designing for one and retrofitting the other.

Frequently Asked Questions

Does GDPR compliance mean we are compliant with the AI Act?

No. The AI Act imposes obligations that go significantly beyond the GDPR, including conformity assessments, registration in the EU AI database, CE marking, specific technical documentation requirements for AI systems, GPAI model obligations, and prohibited AI practices that the GDPR does not address. GDPR compliance is a necessary but not sufficient condition for organisations operating high-risk AI systems. The two frameworks must be addressed in parallel, not sequentially.

Do we need both a DPIA and a Fundamental Rights Impact Assessment?

Potentially yes, and they serve different purposes. A GDPR DPIA assesses the risks that a processing activity poses to data subjects' rights and freedoms, and is required when processing is likely to result in high risk; automated decision-making with significant effects almost always triggers this. An AI Act FRIA assesses the impact of deploying a high-risk AI system on fundamental rights and is required for deployers that are bodies governed by public law, private entities providing public services, and certain private deployers, notably of credit-scoring systems and of life and health insurance risk-assessment systems. The FRIA's scope is broader than the DPIA's, as it addresses fundamental rights beyond data protection (employment rights, non-discrimination, freedom of expression) as well as data rights.

How should Data Processing Agreements be updated for AI?

DPAs under the GDPR should be reviewed and supplemented to address the AI Act's division of responsibilities between provider and deployer. Key additions include identifying whether the service provider is an AI Act provider and whether the customer is a deployer for each AI system involved, addressing instructions for use and human oversight obligations, allocating responsibility for post-market monitoring and incident reporting, and ensuring that the DPA's data processing descriptions reflect the actual data flows through the AI system. For new AI service agreements, the AI Act obligations should be addressed in the contract alongside the GDPR DPA provisions.

What are the penalties if both GDPR and AI Act violations occur in the same system?

Both sets of penalties apply independently. A high-risk AI system deployed without a conformity assessment can attract an AI Act fine of up to EUR 15 million or 3% of worldwide annual turnover, whichever is higher, while the same system's personal data processing failures can attract GDPR fines of up to EUR 20 million or 4% of worldwide annual turnover, whichever is higher. The penalties are not aggregated under a single maximum; each infringement is addressed independently by the competent authority. National data protection authorities and national market surveillance authorities have overlapping but distinct jurisdiction, and enforcement actions can proceed in parallel.
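To make the interaction of the two ceilings concrete, the maxima can be sketched as a small calculation. The fixed caps and percentages are the statutory figures quoted above; the turnover figure and the helper function are purely illustrative, and any actual fine depends on the supervisory authority's assessment of the infringement:

```python
# Illustrative sketch of how the two penalty ceilings interact.
# Caps: EUR 15m / 3% (AI Act, high-risk non-compliance) and
# EUR 20m / 4% (GDPR). The turnover and helper are hypothetical.

def max_fine_eur(turnover_eur: int, fixed_cap_eur: int, pct: int) -> int:
    """For undertakings, the ceiling is the HIGHER of the fixed cap
    or the given percentage of worldwide annual turnover."""
    return max(fixed_cap_eur, turnover_eur * pct // 100)

turnover = 2_000_000_000  # hypothetical EUR 2bn worldwide turnover

ai_act_ceiling = max_fine_eur(turnover, 15_000_000, 3)  # EUR 60m
gdpr_ceiling = max_fine_eur(turnover, 20_000_000, 4)    # EUR 80m

# The two ceilings are not merged into one: each regime applies its
# own maximum independently, so exposure on a single system can
# reach both in parallel proceedings.
print(f"AI Act ceiling: EUR {ai_act_ceiling:,}")
print(f"GDPR ceiling: EUR {gdpr_ceiling:,}")
```

For a smaller undertaking, the fixed cap dominates: with EUR 100 million turnover, 3% is only EUR 3 million, so the AI Act ceiling remains EUR 15 million.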

Bart Lieben
Attorney-at-Law
