Responsible AI Platform
📋 Impact Assessments

DPIA vs FRIA

Which impact assessment do you need for AI?

DPIA under GDPR, FRIA under the AI Act - two different but related assessments. Learn when each applies, what the key differences are, and how to combine them.

GDPR
DPIA
AI Act
FRIA
Aug 2026
FRIA mandatory
~10 min read
Last updated: February 2026
📋

Why Two Different Impact Assessments?

GDPR and AI Act complement each other

In 2016, GDPR introduced the Data Protection Impact Assessment (DPIA) for privacy risks. With the EU AI Regulation (August 2024), the Fundamental Rights Impact Assessment (FRIA) was added, specifically for high-risk AI systems. This dual obligation arises because AI systems can pose broader risks than data protection alone. While a DPIA focuses on privacy-related risks, a FRIA looks at the full spectrum of fundamental rights: human dignity, non-discrimination, freedom of expression, access to justice and more.

🔒

DPIA

Privacy & data protection

⚖️

FRIA

All fundamental rights

🤖

AI-specific

FRIA for high-risk AI

🔗

Complementary

Can be combined

🔒

DPIA: Data Protection Central

Article 35 GDPR - When and how

Under Article 35 GDPR, you must perform a DPIA when processing "is likely to result in a high risk to the rights and freedoms of natural persons." This applies in particular when you systematically and extensively evaluate personal aspects of people through automated processing (including profiling) and base decisions on that evaluation that significantly affect them. A DPIA must be performed before processing starts - during the planning phase, not afterwards.
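The Article 35 trigger conditions above can be sketched as a simple screening check. This is a minimal illustration only - the function name and inputs are hypothetical, and a real DPIA screening should follow the criteria lists published by your supervisory authority:

```python
# Minimal sketch of the Article 35 GDPR DPIA screening logic described
# above. Names and inputs are illustrative, not a legal tool.

def dpia_required(systematic_evaluation: bool,
                  automated_processing: bool,
                  decisions_affect_people: bool) -> bool:
    """A DPIA is needed when processing is likely to pose a high risk:
    systematic and extensive evaluation of personal aspects, via
    automated processing (including profiling), with decisions that
    significantly affect the people concerned."""
    return (systematic_evaluation
            and automated_processing
            and decisions_affect_people)

# Example: an AI-driven hiring screen that ranks applicants automatically
print(dpia_required(True, True, True))  # -> True
```

The check is deliberately conjunctive: all three Article 35 elements must be present before the screening flags a DPIA as required.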

📅

Timing

Before processing starts

🎯

Focus

Personal data

📝

Reporting

Internal (normally)

👤

Responsible

Data controller

⚖️

FRIA: Broader Fundamental Rights Focus

Article 27 AI Regulation - For high-risk AI

The Fundamental Rights Impact Assessment (FRIA) is established in Article 27 of the EU AI Regulation. Unlike the DPIA, the FRIA takes a human-centered approach by examining all relevant fundamental rights: human dignity, equality before the law, non-discrimination, cultural diversity, and the right to an effective remedy. The FRIA is mandatory for deployers of high-risk AI systems that are: (1) public bodies or private entities providing services of public interest (education, healthcare, social services), or (2) organizations using AI for creditworthiness assessments or insurance risk assessment.
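The Article 27 scope described above can likewise be sketched as a small applicability check. The function and category labels below are hypothetical shorthand, not terms from the Regulation, and real scoping requires legal review:

```python
# Illustrative sketch of the Article 27 FRIA scope described above.
# Labels and function names are assumptions for illustration only.

PUBLIC_INTEREST_SERVICES = {"education", "healthcare", "social_services"}
FINANCIAL_USE_CASES = {"creditworthiness", "insurance_risk_assessment"}

def fria_required(deploys_high_risk_ai, is_public_body,
                  service_sector=None, use_case=None):
    """FRIA applies to deployers of high-risk AI that are (1) public
    bodies or private providers of services of public interest, or
    (2) organizations using AI for creditworthiness or insurance
    risk assessment."""
    if not deploys_high_risk_ai:
        return False
    if is_public_body or service_sector in PUBLIC_INTEREST_SERVICES:
        return True
    return use_case in FINANCIAL_USE_CASES

# A private bank using high-risk AI for credit scoring:
print(fria_required(True, False, use_case="creditworthiness"))  # -> True
```

Note the gate at the top: without a high-risk AI system in play, the FRIA obligation does not arise at all, regardless of sector.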

🏛️

Public bodies

Always FRIA mandatory

🏦

Financial sector

Credit & insurance

📊

Reporting

Mandatory to regulator

📅

Deadline

From August 2026


🔄

The 5 Key Differences

DPIA vs FRIA comparison table

1. Focus: DPIA centers on data protection and privacy; FRIA covers all fundamental rights.
2. Data type: DPIA concerns only personal data; FRIA also covers non-personal data.
3. Scope: DPIA applies to all high-risk data processing; FRIA specifically to high-risk AI systems.
4. Who is obligated: DPIA falls on data controllers; FRIA on certain categories of AI users (deployers).
5. Reporting: DPIA is internal (except for prior consultation); FRIA requires mandatory notification to the supervisor.

🎯

Focus

Privacy vs all rights

📊

Data

Personal data vs all data

👥

Who

Controller vs deployer

📝

Reporting

Internal vs external mandatory

🔗

Can DPIA and FRIA be Combined?

Article 27(4) AI Regulation acknowledges the overlap

Yes. The AI Regulation acknowledges the overlap between the two assessments: Article 27(4) states that a FRIA can complement an existing DPIA when both are required. You can choose between:

1. Two separate assessments - separate DPIA and FRIA documents.
2. An integrated assessment - one combined document that meets both sets of requirements.

For successful integration, the document must cover all DPIA requirements from Article 35 GDPR, contain all FRIA elements from Article 27 AI Regulation, and address the broader fundamental rights scope. Practical tip: start with your existing DPIA template and extend it with the FRIA elements.

Yes, possible

Article 27(4) AI Regulation

1️⃣

Option 1

Two separate documents

2️⃣

Option 2

One integrated document

💡

Tip

Extend DPIA template


🔄

Digital Omnibus Impact on Impact Assessments

How the simplification proposal affects DPIA and FRIA

The Digital Omnibus on AI proposal (November 19, 2025) has indirect consequences for impact assessments. High-risk AI obligations, including the FRIA requirement for deployers, would be deferred until 6 or 12 months after harmonized standards become available (at the latest December 2027 or August 2028). This gives organizations more time to set up their FRIA processes.

The proposal also expands the possibility to use sensitive personal data for bias detection to all AI systems (not just high-risk ones), lowering the threshold from "strictly necessary" to "necessary." This has direct implications for DPIAs: any processing of sensitive personal data requires a thorough privacy assessment.

The EDPB and EDPS express concerns about the deferral in their Joint Opinion 1/2026 and advise keeping the original timeline for certain obligations, such as transparency requirements. They also emphasize that DPAs (Data Protection Authorities) must retain a central role in supervising personal data processing in AI contexts.

FRIA deferral

Deadline may shift to Dec 2027 or Aug 2028.

📊

Bias detection expanded

Sensitive data for bias detection in all AI, not just high-risk.

🔒

DPIA impact

More bias detection = more DPIAs needed for sensitive data.

🛡️

EDPB/EDPS warning

Minimize delays, maintain DPA role in AI supervision.

Frequently Asked Questions

Answers to the most common questions about the EU AI Act

Ready to get started?

Discover how we can help your organization with EU AI Act compliance.

500+
Professionals trained
50+
Organizations helped