
DORA & AI Act

Digital resilience and artificial intelligence in the financial sector

Zahed Ashkara · Updated: June 2026 · ~13 min read

Why banks and insurers must tackle DORA and the AI Act together

Financial institutions — banks, insurers, investment firms, payment service providers — have operated under strict supervisory regimes for years. Yet the combination of DORA and the AI Act introduces a challenge that many compliance teams underestimate: the two regulations impose overlapping requirements on exactly the same systems.

DORA (Regulation (EU) 2022/2554) governs the digital operational resilience of financial entities and has been fully in force since 17 January 2025. The AI Act governs the development and use of AI systems and applies in phases. Where they intersect is no coincidence: AI systems making financial decisions — credit scoring, fraud detection, trading algorithms — are simultaneously ICT systems under DORA and high-risk AI systems under the AI Act.

Organisations that treat these as two separate tracks run double the risk: compliance with one regulation without regard for the other creates gaps. The most efficient approach is an integrated framework that addresses both legal requirements simultaneously.

What is DORA?

The Digital Operational Resilience Act (DORA) is an EU regulation specifically designed for the financial sector. It requires banks, insurers, investment firms, central counterparties, payment institutions and other financial entities to establish their ICT risk management in a uniform and demonstrable way. DORA is not a directive that allows national transposition — it applies directly in all EU member states.

DORA rests on five pillars: ICT risk management (internal governance and controls), incident reporting (mandatory notification of major ICT-related incidents), digital resilience testing (penetration testing and threat-led testing for systemically important institutions), management of third-party risks (due diligence and contractual requirements for outsourcing), and information sharing on cyber threats among financial institutions.

DORA applies to more than 22,000 financial entities in the EU. National supervisors — DNB and AFM in the Netherlands — are responsible for supervision. The largest and most systemically important institutions face additional requirements, including mandatory Threat-Led Penetration Tests (TLPT) every three years.

The 5 intersections of DORA and AI Act

Below we describe the five concrete points where DORA and the AI Act converge. For each intersection we explain which obligations apply, which articles are relevant, and what this means concretely for the compliance approach of financial institutions.

01

ICT risk management and AI risk management

DORA Art. 6-16 × AI Act Art. 9

DORA requires financial entities to establish a comprehensive ICT risk management framework: policies, procedures, protocols and tools to identify, classify and manage ICT risk. Article 6 establishes that the management body bears ultimate responsibility. Article 8 requires continuous monitoring of ICT vulnerabilities. Article 16 requires smaller entities to maintain a simplified but demonstrable risk management framework.

AI Act Article 9 requires providers of high-risk AI systems to maintain a risk management process throughout the entire lifecycle of the system. This risk management covers identification, analysis and mitigation of risks to health, safety and fundamental rights. For financial institutions deploying AI systems for credit, fraud detection or asset management, both frameworks apply to the same system. The risk management process of Art. 9 AI Act and the ICT risk management framework of DORA must be aligned — or better yet, integrated into a single coherent framework.

In practice: document each AI system used for financial decision-making both as an ICT asset (DORA risk register) and as a high-risk AI system (AI Act technical file). Use the same risk management process as a foundation for both, and add framework-specific elements where required.
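As a sketch, the dual registration can be a single record with two views — one for the DORA risk register, one for the AI Act technical file. All field and class names below are illustrative assumptions, not terms prescribed by either regulation:

```python
from dataclasses import dataclass, field

@dataclass
class AISystemRecord:
    """Hypothetical combined register entry for one AI system."""
    name: str
    purpose: str                       # e.g. "credit scoring"
    # DORA view: the system as an ICT asset (Art. 6-8)
    dora_asset_id: str
    dora_criticality: str              # e.g. "critical", "important"
    ict_dependencies: list = field(default_factory=list)
    # AI Act view: the system as high-risk AI (Art. 9, Art. 11)
    high_risk: bool = True
    technical_file_ref: str = ""       # pointer to the Art. 11 documentation
    last_risk_review: str = ""         # shared lifecycle risk review date

record = AISystemRecord(
    name="retail-credit-scoring",
    purpose="credit scoring",
    dora_asset_id="ICT-0042",
    dora_criticality="critical",
    ict_dependencies=["core-banking", "bureau-data-feed"],
    technical_file_ref="docs/ai-act/credit-scoring-v3.pdf",
    last_risk_review="2026-05-01",
)
print(record.name, record.dora_criticality, record.high_risk)
```

One record, two regulatory views: the shared fields (risk review date, purpose) are maintained once, while framework-specific fields stay clearly separated.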

02

Incident reporting: dual notification obligation

DORA Art. 19 × AI Act Art. 73

DORA Article 19 requires financial entities to classify and report major ICT-related incidents. Classification is based on criteria such as duration of the disruption, number of affected clients, financial impact and reputational damage. Major incidents must be reported to the competent supervisor — DNB in the Netherlands for banks and insurers — with an initial notification within 4 hours, an intermediate report within 72 hours, and a final report within one month.

AI Act Article 73 introduces a comparable but broader notification obligation: providers of high-risk AI systems must report serious incidents — where the AI system poses a risk to life, safety or fundamental rights — to the market surveillance authority. For financial AI systems classified as high-risk, an AI incident can simultaneously be an ICT incident. Consider a defective fraud detection system blocking legitimate customer payments, or a credit scoring model making incorrect decisions due to a data anomaly. Such incidents trigger notification obligations under both regulations simultaneously, with different time windows and different recipients.

In practice: explicitly extend the DORA incident management process with an AI check. For every ICT incident involving an AI system, compliance must assess whether it is also a serious incident under Art. 73 AI Act. Develop combined notification templates and document which supervisor receives which notification.
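The AI check and the DORA reporting windows can be sketched as a small helper. The windows (initial report within 4 hours of classification, intermediate within 72 hours, final within one month) follow the article text; the function names, the 30-day approximation of "one month", and the serious-incident article number (Art. 73 in the final AI Act text) are stated assumptions, not a full legal decision procedure:

```python
from datetime import datetime, timedelta

def dora_deadlines(classified_at: datetime) -> dict:
    """DORA Art. 19 reporting windows, counted from classification as major."""
    return {
        "initial_report": classified_at + timedelta(hours=4),
        "intermediate_report": classified_at + timedelta(hours=72),
        "final_report": classified_at + timedelta(days=30),  # "one month"
    }

def notifications(involves_ai: bool, risk_to_rights_or_safety: bool) -> list:
    """Which notification tracks does a major ICT incident trigger?"""
    tracks = [("DORA Art. 19", "financial supervisor (e.g. DNB)")]
    # The AI check: a major ICT incident involving a high-risk AI system
    # may also be a serious incident under the AI Act.
    if involves_ai and risk_to_rights_or_safety:
        tracks.append(("AI Act Art. 73", "market surveillance authority"))
    return tracks

t0 = datetime(2026, 6, 1, 9, 0)
print(dora_deadlines(t0)["initial_report"])  # 2026-06-01 13:00:00
print(notifications(involves_ai=True, risk_to_rights_or_safety=True))
```

The point of the sketch: the two tracks have different recipients and different clocks, so the incident workflow needs both recorded per incident rather than a single deadline field.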

03

Third parties: AI providers as critical ICT suppliers

DORA Art. 28-30 × AI Act provider obligations

DORA pays particular attention to outsourcing and the use of third-party ICT service providers. Article 28 requires financial entities to maintain a strategy for managing ICT third-party risks. Article 29 sets requirements for the due diligence process. Article 30 contains an extensive list of mandatory contract provisions: the right to audit access, processing locations, service levels, data availability, exit strategies and — crucially — availability of information on the provider's ICT security.

When a financial institution obtains an AI system from an external provider — a cloud-based credit scoring model, a SaaS fraud detection platform, an AI-driven asset management module — that provider is simultaneously an ICT third party under DORA and a provider of high-risk AI under the AI Act. The AI Act imposes obligations on providers regarding technical documentation, EU conformity assessment, CE marking and post-market monitoring. Financial institutions acting as deployers must verify that their AI suppliers comply with these AI Act requirements — in addition to DORA contract requirements. This means in practice: extending the DORA due diligence checklist for ICT suppliers with AI Act-specific questions about the technical file, declaration of conformity and incident reporting.

In practice: add an AI Act clause to all new and renewing contracts with AI suppliers. Require evidence of compliance with Chapter III of the AI Act (technical file, EU declaration of conformity, registration in the EU database). Combine this check with DORA audit rights so you conduct a single integrated vendor assessment.
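A single integrated vendor assessment can be sketched as one pass over both checklists. The item lists below are illustrative excerpts of the DORA Art. 30 contract provisions and the AI Act evidence items named above, not the full legal requirements:

```python
# Illustrative excerpts — not exhaustive legal checklists.
DORA_CONTRACT_ITEMS = [
    "audit and access rights",
    "processing locations disclosed",
    "service levels defined",
    "exit strategy agreed",
]
AI_ACT_EVIDENCE_ITEMS = [
    "technical file available (Art. 11)",
    "EU declaration of conformity",
    "registration in EU database",
    "incident reporting channel agreed",
]

def assess_vendor(evidence: set) -> dict:
    """Return the missing items per framework for a single AI supplier."""
    return {
        "dora_missing": [i for i in DORA_CONTRACT_ITEMS if i not in evidence],
        "ai_act_missing": [i for i in AI_ACT_EVIDENCE_ITEMS if i not in evidence],
    }

gaps = assess_vendor({
    "audit and access rights",
    "service levels defined",
    "technical file available (Art. 11)",
    "EU declaration of conformity",
})
print(gaps["dora_missing"])
print(gaps["ai_act_missing"])
```

Running both lists in one assessment is exactly the "single integrated vendor assessment" the text recommends: one evidence collection exercise, two gap reports.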

04

Testing and resilience

DORA TLPT × AI Act Art. 9 testing requirements

DORA prescribes extensive testing programmes. Article 24 requires financial entities to conduct annual basic resilience tests (vulnerability assessments, open source analyses, penetration tests). Article 26 requires the most systemically important institutions to conduct Threat-Led Penetration Tests (TLPT) every three years. TLPT are standardised, supervisor-validated cyber attack simulations on live production systems, executed under the TIBER-EU framework.

AI Act Article 9 requires providers of high-risk AI systems to conduct structured testing throughout the entire development and deployment lifecycle: testing to confirm the system functions as designed, that risks have been adequately mitigated, and that the system performs consistently for all relevant user populations. This includes testing for unintended outcomes, systematic errors and bias. For financial AI systems — credit scoring models, fraud detection algorithms — both DORA tests (technical resilience against attacks) and AI Act tests (functional reliability and fairness) are mandatory on the same system. They do not address the same risks, but both test programmes should be coordinated to leverage overlap in test environments, test data and test timing.

In practice: schedule DORA resilience tests and AI Act validation tests in the same test window. Use shared test environments where possible. Ensure the test programme for AI systems is explicitly included in DORA test programme reporting to supervisors.

05

Governance and board responsibility

DORA Art. 5 × AI Act Art. 26

DORA Article 5 places ultimate responsibility for ICT risk management explicitly with the management body of the financial entity. The board of directors or supervisory board must approve the ICT risk management framework, allocate sufficient resources, define specific roles, and foster a culture of digital resilience. This is not a delegatable responsibility: supervisors look at board level in the event of an incident.

AI Act Article 26 defines the obligations of deployers — the organisations that put high-risk AI systems into use. Deployers must organise human oversight, assess the suitability of the system for the intended application, monitor the system for anomalies, and — under Article 27, for public bodies but also for deployers of credit scoring and life or health insurance pricing AI — conduct a Fundamental Rights Impact Assessment (FRIA). The governance requirements of both regulations target the same board level. In practice, this means financial institutions must integrate responsibility for DORA compliance and AI Act compliance at board level. A separate DORA governance committee without an AI Act mandate, or vice versa, creates blind spots.

In practice: extend the DORA governance framework with explicit AI Act responsibilities. Assign a management-level system owner for each high-risk AI system whose mandate includes both DORA ICT risk requirements and AI Act deployer obligations. Report to the board on both frameworks in a single integrated overview.

What do you need to double-regulate?

Below is a practical checklist of measures that both DORA and the AI Act require from financial institutions. These are the points where an integrated approach delivers the most value.

Risk registration for AI systems as ICT asset and high-risk AI

Art. 6-8 DORA (ICT risk management framework and register) ⚖️ Art. 9 AI Act (risk management for high-risk AI throughout lifecycle)

Technical documentation for every AI application in financial decision-making

Art. 8 DORA (registration of ICT assets and dependencies) ⚖️ Art. 11 + 49 AI Act (technical file + registration in EU database)

Combined notification procedure for ICT incidents and AI incidents

Art. 19 DORA (notification of major ICT-related incidents) ⚖️ Art. 73 AI Act (notification of serious incidents for high-risk AI)

Extended due diligence for AI suppliers as critical ICT third parties

Art. 28-30 DORA (strategy, due diligence and contract requirements for third parties) ⚖️ Art. 26 AI Act (deployer obligations regarding providers)

Coordinated test programme: DORA resilience tests and AI validation tests

Art. 24-26 DORA (basic resilience tests and TLPT) ⚖️ Art. 9(7) AI Act (testing before market introduction)

Human oversight for AI systems influencing financial decisions

Art. 5 DORA (board responsibility for ICT management) ⚖️ Art. 14 AI Act (human oversight for high-risk systems)

Integrated governance framework with board reporting on both regulations

Art. 5 DORA (ultimate responsibility of management body) ⚖️ Art. 26 AI Act (deployer obligations and organisational measures)

AI Act contractual clauses in all AI supplier contracts

Art. 30 DORA (mandatory contract provisions for ICT third parties) ⚖️ Art. 26 + 28 AI Act (contractual arrangements between provider and deployer)

How resilient is your institution on both fronts?

The AI Readiness Score tests your organisation specifically on the intersections between the AI Act and DORA. You get a score per theme and concrete recommendations for the financial sector.