Why Take Action Now?
The AI Act has a major impact on the healthcare sector
August 2026
Obligations for high-risk AI systems come into effect
Medical AI = High-risk
AI for diagnosis and treatment automatically falls under the strictest rules
Overlap with MDR
Medical devices with AI must comply with both MDR and AI Act
Physician Remains Responsible
Human oversight requirement demands clear protocols
High-risk AI in Healthcare
These AI applications fall under strict AI Act requirements (Annex III)
Diagnostic AI
AI systems that analyze medical images or support diagnoses — from X-ray analysis to pathology. Often also qualifies as a medical device under MDR.
Treatment Support
Systems providing treatment advice, medication dosing or therapy selection support. Both clinical validation and AI Act conformity are required.
Triage & Care Allocation
AI that determines which patients get priority or how care is allocated. Direct impact on access to essential services.
Predictive Models
Systems predicting patient outcomes or assessing risks. Classification depends on degree of autonomy and clinical impact.
Specific Challenges for Healthcare Institutions
The AI Act brings unique compliance questions for the healthcare sector
MDR and AI Act Alignment
Medical devices with AI must comply with both. How to integrate requirements?
Information Security Integration
How does AI Act compliance fit into existing healthcare information security?
Clinical Validation
AI systems must be clinically validated. What evidence is needed?
Physician in the Loop
Human oversight is mandatory. How to ensure physicians make informed decisions?
Vendor Management
Much AI comes from external vendors. What guarantees to request?
Patient Rights & Transparency
Patients have a right to explanation. How to communicate about AI use in care?
AI Act Compliance Roadmap
Practical steps for healthcare institutions
AI Inventory
2-4 weeks: Map all AI systems. Which systems influence patient care?
Classification & MDR Check
2-3 weeks: Determine the AI Act classification and potential MDR class per system.
Gap Analysis
4-8 weeks: Compare current documentation with AI Act and MDR requirements.
Clinical Governance
2-4 months: Implement protocols for human oversight and clinical decision-making.
Monitoring & Vigilance
Ongoing: Set up post-market surveillance for continuous AI performance monitoring.
Implementation Roadmap
Detailed 6-phase timeline with concrete deliverables for healthcare institutions
Phase 1: AI Inventory (Month 1-2)
Phase 2: Risk Classification (Month 2-3)
Phase 3: Gap Analysis (Month 3-5)
Phase 4: Governance & Protocols (Month 5-8)
Phase 5: Implementation (Month 8-14)
Phase 6: Audit & Certification (Month 14-18)
AI System Inventory Guide
Typical AI systems in healthcare institutions and their likely classification
Note: AI systems in healthcare are often scattered across departments (radiology, lab, clinical). Involve all departments in the inventory — not just IT.
Diagnostic AI
Usually high-risk: Annex III + often MDR medical device — dual compliance required
Clinical Decision Support
Often high-risk: when it directly influences clinical decisions
Triage & Allocation
Usually high-risk: direct impact on access to essential care — Annex III category 5
Predictive Models
Context-dependent: depends on autonomy and clinical impact — may be high-risk
Administration & Workflow
Usually minimal risk: unless it directly affects patient care
Patient Communication
Limited risk: transparency obligation (Art. 50) — the patient must know it is AI
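The inventory guide above can be turned into a simple screening tool. The sketch below is illustrative only: the category names and indicative classifications mirror the list above, but the function and data structure are our own and do not constitute a legal determination.

```python
# Illustrative mapping from the inventory guide above.
# These labels are indicative, not legal classifications.
LIKELY_CLASSIFICATION = {
    "Diagnostic AI": "usually high-risk",
    "Clinical Decision Support": "often high-risk",
    "Triage & Allocation": "usually high-risk",
    "Predictive Models": "context-dependent",
    "Administration & Workflow": "usually minimal risk",
    "Patient Communication": "limited risk",
}

def flag_for_review(inventory: list[dict]) -> list[str]:
    """Return names of systems whose category suggests high-risk review."""
    needs_review = ("usually high-risk", "often high-risk", "context-dependent")
    return [
        system["name"]
        for system in inventory
        if LIKELY_CLASSIFICATION.get(system["category"], "unknown") in needs_review
    ]

# Hypothetical inventory entries, one per department-owned system.
inventory = [
    {"name": "Chest X-ray triage", "category": "Diagnostic AI"},
    {"name": "Appointment scheduler", "category": "Administration & Workflow"},
]
print(flag_for_review(inventory))  # ['Chest X-ray triage']
```

A screening pass like this helps prioritize the gap analysis, but every flagged system still needs the full classification review described below.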
Classification Decision Tree for Healthcare AI
Quickly determine the risk classification of your AI system
Is the AI system a medical device under the MDR?
High-risk — dual compliance (AI Act + MDR)
Go to next question
Does it influence clinical decisions about patients?
Likely high-risk
Go to next question
Is it supportive with physician override (physician always decides)?
Possibly limited risk
Go to next question
Is it purely administrative without impact on patient care?
Minimal risk
Consult an expert for classification
This is a simplified decision tree. Consult your legal team and CMIO for the definitive classification, especially for MDR overlap.
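The simplified decision tree can be sketched as a short function. The questions and outcomes are taken from the tree above; the function name and boolean parameters are our own. As the note says, this is indicative only: consult your legal team and CMIO for a definitive classification.

```python
# Minimal sketch of the simplified decision tree above.
# Indicative only; not a substitute for legal or CMIO review.
def classify_healthcare_ai(
    is_mdr_medical_device: bool,
    influences_clinical_decisions: bool,
    physician_always_decides: bool,
    purely_administrative: bool,
) -> str:
    """Walk the four yes/no questions in order and return the outcome."""
    if is_mdr_medical_device:
        return "high-risk (dual compliance: AI Act + MDR)"
    if influences_clinical_decisions:
        return "likely high-risk"
    if physician_always_decides:
        return "possibly limited risk"
    if purely_administrative:
        return "minimal risk"
    return "unclear: consult an expert"

# Example: a dosing advisor that is not itself a medical device
# but does influence clinical decisions.
print(classify_healthcare_ai(False, True, False, False))  # likely high-risk
```

Note that the order of the questions matters: MDR status is checked first because it dominates every other consideration, and only systems that fail all four tests fall through to expert review.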
Governance Structure for Healthcare AI
Recommended organizational structure for AI governance in healthcare institutions
Involve clinical leadership from the start — AI governance in healthcare cannot succeed without medical expertise.
Key Roles
CMIO
Bridges clinical practice and IT — responsible for medical AI strategy and clinical validation
AI Compliance Officer
Overall monitoring of AI Act compliance and coordination with regulators
Clinical AI Champion
Physician per department monitoring AI use and ensuring human oversight (Art. 14)
Data Protection Officer
Ensures GDPR compliance for sensitive health data and DPIAs
Compliance Checklist for High-risk Healthcare AI
Concrete checkpoints per high-risk AI system in healthcare
This checklist applies per high-risk system. For medical devices, MDR/IVDR requirements also apply. Consult your CMIO and legal team.
Common Mistakes in Healthcare
Avoid these pitfalls in AI Act implementation for healthcare institutions
Treating all clinical software as high-risk
Many support tools are not. Careful classification saves millions in unnecessary compliance costs.
Forgetting MDR when focusing on AI Act
Medical AI is often also a medical device. Two separate compliance tracks are needed.
Not involving clinicians in compliance
AI Act compliance in healthcare cannot succeed without physicians, nurses and clinical leadership.
Assuming vendor CE marking covers AI Act
CE marking under MDR is different from AI Act conformity. Both must be ensured separately.
Skipping post-market surveillance
Mandatory for high-risk systems. Clinical AI requires continuous performance and safety monitoring.
Not informing patients about AI use
The duty to inform and the AI Act's transparency requirements mean patients must know when AI is used in their care.
What Makes Healthcare AI Different?
Sector-specific considerations
Triple Regulated
Healthcare AI falls under AI Act, MDR and national healthcare legislation
Clinical Evidence Required
Technical compliance alone is not enough — clinical validation is essential
Human Lives at Stake
Errors in medical AI can directly lead to harm or death
Doctor-Patient Relationship
AI must not undermine the therapeutic relationship
Regulatory Overlap
How the AI Act connects with existing healthcare regulation
MDR / IVDR
Overlap: CE marking, clinical evaluation, post-market surveillance
Practical tip: AI Act conformity assessment and MDR clinical evaluation can partially run in parallel — coordinate with your notified body
GDPR
Overlap: Sensitive health data, DPIA, patient rights
Practical tip: FRIA can partially overlap with DPIA — combine where possible, but mind the specific requirements for health data
NEN 7510
Overlap: Healthcare information security, access control, logging
Practical tip: Your NEN 7510 certification covers part of the AI Act data governance requirements (Art. 10)
Wgbo (Dutch Medical Treatment Contracts Act)
Overlap: Physician duty to inform, consent, patient records
Practical tip: AI Act transparency requirements reinforce the Wgbo duty to inform — integrate into existing informed consent procedures
Wkkgz (Dutch Healthcare Quality, Complaints and Disputes Act)
Overlap: Quality, safety, incident reporting
Practical tip: Report AI incidents through existing Wkkgz procedures, but add AI-specific monitoring
Related Articles
Deepen your knowledge of AI Act compliance in healthcare
FRIA: Complete Guide to Article 27 AI Act
Everything about the mandatory fundamental rights impact assessment for high-risk AI systems.
DPIA vs FRIA: Practical Comparison
Understand the difference between a DPIA and FRIA and when you need which.
Data Quality and Bias Mitigation
From raw source to robust model — essential for medical AI applications.
Ready to Start AI Act Compliance?
Practical tools and guidance for healthcare institutions