
AI Act Compliance for Healthcare

Medical diagnostics, treatment support and care allocation — high-risk under the AI Act

Practical guidelines for hospitals, clinics and other healthcare institutions to comply with the EU AI Act.

View the compliance checklist

Why Take Action Now?

The AI Act has a major impact on the healthcare sector

August 2025

First AI Act obligations take effect; the core high-risk requirements follow from August 2026

Medical AI = High-risk

AI for diagnosis and treatment automatically falls under the strictest rules

Overlap with MDR

Medical devices with AI must comply with both MDR and AI Act

Physician Remains Responsible

Human oversight requirement demands clear protocols

High-risk AI in Healthcare

These AI applications fall under strict AI Act requirements (Annex III)

Diagnostic AI

AI systems that analyze medical images or support diagnoses — from X-ray analysis to pathology. Often also qualifies as a medical device under MDR.

CT/MRI image analysis · Dermatological AI · Pathology screening · ECG interpretation · Radiology AI · Ophthalmology screening

Treatment Support

Systems providing treatment advice, medication dosing or therapy selection support. Both clinical validation and AI Act conformity are required.

Clinical Decision Support · Medication dosing · Therapy selection · Treatment protocols · Interaction checkers · Dose optimization

Triage & Care Allocation

AI that determines which patients get priority or how care is allocated. Direct impact on access to essential services.

ER triage systems · Waitlist prioritization · Bed allocation · Care intensity assessment · ICU admission prediction · Ambulance dispatching

Predictive Models

Systems predicting patient outcomes or assessing risks. Classification depends on degree of autonomy and clinical impact.

Sepsis early warning · Readmission risk · Mortality prediction · Deterioration alerts · Fall risk screening · Complication prediction

Specific Challenges for Healthcare Institutions

The AI Act brings unique compliance questions for the healthcare sector

MDR and AI Act Alignment

Medical devices with AI must comply with both. How to integrate requirements?

Information Security Integration

How does AI Act compliance fit into existing healthcare information security?

Clinical Validation

AI systems must be clinically validated. What evidence is needed?

Physician in the Loop

Human oversight is mandatory. How to ensure physicians make informed decisions?

Vendor Management

Much AI comes from external vendors. What guarantees to request?

Patient Rights & Transparency

Patients have a right to an explanation. How to communicate about AI use in care?

AI Act Compliance Roadmap

Practical steps for healthcare institutions

1

AI Inventory

2-4 weeks

Map all AI systems. Which systems influence patient care?

2

Classification & MDR Check

2-3 weeks

Determine AI Act classification and potential MDR class per system.

3

Gap Analysis

4-8 weeks

Compare current documentation with AI Act and MDR requirements.

4

Clinical Governance

2-4 months

Implement protocols for human oversight and clinical decision-making.

5

Monitoring & Vigilance

Ongoing

Set up post-market surveillance for continuous AI performance monitoring.

18-month trajectory

Implementation Roadmap

Detailed 6-phase timeline with concrete deliverables for healthcare institutions

1

AI Inventory

Month 1-2
Complete AI system register per department
MDR classification per system
Owners and responsible parties
2

Risk Classification

Month 2-3
AI Act classification per system
MDR/IVDR class determination
Dual compliance justification
3

Gap Analysis

Month 3-5
Per high-risk system: AI Act + MDR gaps
Clinical validation status
CE marking check
4

Governance & Protocols

Month 5-8
AI Governance Committee
CMIO role and mandate
Human oversight protocols
Patient information policy
5

Implementation

Month 8-14
Technical documentation
Conduct FRIAs
Set up post-market surveillance
Medical staff training
6

Audit & Certification

Month 14-18
Internal audit
Notified body assessment
IGJ readiness check
Continuous monitoring

AI System Inventory Guide

Typical AI systems in healthcare institutions and their likely classification

Note: AI systems in healthcare are often scattered across departments (radiology, lab, clinical). Involve all departments in the inventory — not just IT.
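A department-spanning inventory is easier to keep consistent when each system is recorded with the same fields. A minimal sketch of what one register entry could look like — the field names and example systems here are illustrative, not a prescribed format:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AISystemEntry:
    name: str
    department: str               # e.g. radiology, lab, clinical
    vendor: str
    influences_patient_care: bool # drives AI Act classification priority
    mdr_class: Optional[str]      # e.g. "IIa", or None if not a medical device
    owner: str                    # accountable clinician or manager

# Two hypothetical entries, one clinical and one administrative
register = [
    AISystemEntry("CT image analysis", "radiology", "VendorX",
                  True, "IIa", "Head of Radiology"),
    AISystemEntry("OR scheduling", "planning", "VendorY",
                  False, None, "Operations Manager"),
]

# Systems that influence patient care should be classified first
priority = [s.name for s in register if s.influences_patient_care]
print(priority)  # ['CT image analysis']
```

Even a spreadsheet with these columns works; the point is that every department reports the same attributes, so classification (step 2 of the roadmap) can start from one list.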

Diagnostic AI

Usually high-risk
CT/MRI analysis · Pathology screening · Radiology AI · Ophthalmology screening

Annex III + often MDR medical device — dual compliance required

Clinical Decision Support

Often high-risk
Treatment advice · Medication dosing · Interaction checkers · Therapy selection

High-risk when it directly influences clinical decisions

Triage & Allocation

Usually high-risk
ER triage · Waitlist prioritization · ICU admission prediction · Ambulance dispatch

Direct impact on access to essential care — Annex III category 5

Predictive Models

Context-dependent
Sepsis early warning · Readmission risk · Deterioration scores · Fall risk prediction

Depends on autonomy and clinical impact — may be high-risk

Administration & Workflow

Usually minimal risk
OR scheduling · Staff rostering · Automated documentation · Speech recognition

Minimal risk unless it directly affects patient care

Patient Communication

Limited risk
Patient portal chatbots · Appointment systems · Symptom checkers · Health education AI

Transparency obligation (Art. 50) — patients must know they are interacting with AI

Classification Decision Tree for Healthcare AI

Quickly determine the risk classification of your AI system

Is the AI system a medical device under the MDR?

Yes

High-risk — dual compliance (AI Act + MDR)

No

Go to next question

Does it influence clinical decisions about patients?

Yes

Likely high-risk

No

Go to next question

Is it supportive with physician override (physician always decides)?

Yes

Possibly limited risk

No

Go to next question

Is it purely administrative without impact on patient care?

Yes

Minimal risk

No

Consult an expert for classification

This is a simplified decision tree. Consult your legal team and CMIO for the definitive classification, especially for MDR overlap.
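For teams triaging a long inventory, the four questions above can be walked mechanically before legal review. A minimal sketch of the simplified tree — the function name and answer strings are illustrative, and the output is a starting point, not a definitive classification:

```python
def classify_healthcare_ai(is_mdr_device: bool,
                           influences_clinical_decisions: bool,
                           physician_always_decides: bool,
                           purely_administrative: bool) -> str:
    """Walk the four questions of the simplified decision tree in order."""
    if is_mdr_device:
        return "high-risk (dual compliance: AI Act + MDR)"
    if influences_clinical_decisions:
        return "likely high-risk"
    if physician_always_decides:
        return "possibly limited risk"
    if purely_administrative:
        return "minimal risk"
    return "consult an expert"

# Example: an early-warning model that is not CE-marked as a medical
# device but does influence clinical decisions
print(classify_healthcare_ai(False, True, False, False))
# likely high-risk
```

Note that the questions are ordered: an MDR medical device is flagged for dual compliance regardless of how the later questions would be answered.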

Governance Structure for Healthcare AI

Recommended organizational structure for AI governance in healthcare institutions

Board of Directors
AI Governance Committee (multidisciplinary)
CMIO (Chief Medical Information Officer)
Medical Staff AI Committee
IT / Data Team
Quality & Safety

Involve clinical leadership from the start — AI governance in healthcare cannot succeed without medical expertise.

Key Roles

CMIO

Bridges clinical practice and IT — responsible for medical AI strategy and clinical validation

AI Compliance Officer

Overall monitoring of AI Act compliance and coordination with regulators

Clinical AI Champion

Physician per department monitoring AI use and ensuring human oversight (Art. 14)

Data Protection Officer

Ensures GDPR compliance for sensitive health data and DPIAs

Compliance Checklist for High-risk Healthcare AI

Concrete checkpoints per high-risk AI system in healthcare

AI system registered in EU database (Art. 49)
Risk management system established, including clinical risks (Art. 9)
Data governance & data quality ensured for sensitive health data (Art. 10)
Technical documentation complete, including clinical validation (Art. 11)
Logging & traceability in place (Art. 12)
Transparency to physicians and patients (Art. 13)
Human oversight established: physician-in-the-loop (Art. 14)
Accuracy & robustness clinically tested (Art. 15)
FRIA conducted as deployer (Art. 27)
Conformity assessment completed, plus CE marking for MDR devices (Art. 43)
Post-market surveillance & vigilance established (Art. 72)
Patient information about AI use available (Art. 13/86)

This checklist applies per high-risk system. For medical devices, MDR/IVDR requirements also apply. Consult your CMIO and legal team.
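Because the checklist applies per system, tracking it as data makes gaps visible across a whole portfolio. A minimal sketch, using the checkpoints listed above keyed by article (structure and function names are illustrative):

```python
# Checklist items from the section above, keyed by AI Act article
CHECKLIST = {
    "Art. 49": "AI system registered in EU database",
    "Art. 9":  "Risk management system established",
    "Art. 10": "Data governance & data quality ensured",
    "Art. 11": "Technical documentation complete",
    "Art. 12": "Logging & traceability in place",
    "Art. 13": "Transparency to physicians and patients",
    "Art. 14": "Human oversight established",
    "Art. 15": "Accuracy & robustness clinically tested",
    "Art. 27": "FRIA conducted as deployer",
    "Art. 43": "Conformity assessment completed",
    "Art. 72": "Post-market surveillance established",
}

def open_gaps(completed: set) -> list:
    """Return the articles still open for one high-risk system."""
    return [art for art in CHECKLIST if art not in completed]

# Example: a system where only registration and logging are done
print(open_gaps({"Art. 49", "Art. 12"}))
```

Running the same check per system turns the gap analysis (roadmap step 3) into a simple report: one row per system, one column per article.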

Common Mistakes in Healthcare

Avoid these pitfalls in AI Act implementation for healthcare institutions

Treating all clinical software as high-risk

Many support tools are not. Careful classification saves millions in unnecessary compliance costs.

Forgetting MDR when focusing on AI Act

Medical AI is often also a medical device. Two separate compliance tracks are needed.

Not involving clinicians in compliance

AI Act compliance in healthcare cannot succeed without physicians, nurses and clinical leadership.

Assuming vendor CE marking covers AI Act

CE marking under MDR is different from AI Act conformity. Both must be ensured separately.

Skipping post-market surveillance

Mandatory for high-risk systems. Clinical AI requires continuous performance and safety monitoring.

Not informing patients about AI use

Duty to inform patients and AI Act transparency require that patients know when AI is used in their care.

What Makes Healthcare AI Different?

Sector-specific considerations

Triple Regulated

Healthcare AI falls under AI Act, MDR and national healthcare legislation

Clinical Evidence Required

Technical compliance alone is not enough — clinical validation is essential

Human Lives at Stake

Errors in medical AI can directly lead to harm or death

Doctor-Patient Relationship

AI must not undermine the therapeutic relationship

Healthcare regulation

Regulatory Overlap

How the AI Act connects with existing healthcare regulation

MDR / IVDR

Overlap: CE marking, clinical evaluation, post-market surveillance

Practical tip: AI Act conformity assessment and MDR clinical evaluation can partially run in parallel — coordinate with your notified body

GDPR

Overlap: Sensitive health data, DPIA, patient rights

Practical tip: FRIA can partially overlap with DPIA — combine where possible, but mind the specific requirements for health data

NEN 7510

Overlap: Healthcare information security, access control, logging

Practical tip: Your NEN 7510 certification covers part of the AI Act data governance requirements (Art. 10)

Wgbo (Dutch Medical Treatment Contracts Act)

Overlap: Physician duty to inform, consent, patient records

Practical tip: AI Act transparency requirements reinforce the Wgbo duty to inform — integrate into existing informed consent procedures

Wkkgz (Dutch Healthcare Quality, Complaints and Disputes Act)

Overlap: Quality, safety, incident reporting

Practical tip: Report AI incidents through existing Wkkgz procedures, but add AI-specific monitoring

Ready to Start AI Act Compliance?

Practical tools and guidance for healthcare institutions

Free 30-minute orientation call

or

Practical updates on AI governance in healthcare