The Story of MediCheck
About a medtech startup that discovered CE marking alone was no longer enough
A fictional scenario based on realistic situations
The Trigger
How it started
MediCheck had just received CE marking for their SkinScan AI under the Medical Device Regulation (MDR). The team was celebrating: finally, they could go to market. But the joy was short-lived.
The call from their notified body was unexpected: "Your AI now also falls under the AI Act. We need to reconsider your file."
An additional layer of legislation. More hoops to jump through. And the question: could they afford to wait? GPs were counting on them. So were patients.
The Questions
What did they need to find out?
What changes for medical AI under the AI Act?
The team dove into the legal text. The AI Act automatically classifies medical devices with AI components as high-risk AI. But what did that mean concretely? The MDR already set requirements for clinical validation, risk management, and post-market surveillance. Where was the added value?
The insight
The AI Act adds specific requirements that the MDR doesn't cover: requirements for training data (representativeness, bias detection), transparency to operators (how does the user understand what the AI does?), and human oversight (can the doctor override the AI?). They are complementary regulations, not replacements.
Why this matters
Many medtech companies assume their MDR compliance is sufficient. But the AI Act asks for a different lens: not just "does the system work?" but also "is it fair, understandable, and controllable?" These dimensions are receiving increasing attention from notified bodies.
Do we need to redo our conformity assessment?
This was the question that made everyone nervous. The CE marking had only just come through, after months of work and significant costs. Did everything need to be redone? The team contacted their notified body for clarity.
The insight
The answer was nuanced. A full reassessment wasn't necessary, but the technical file needed to be supplemented. Specifically: documentation on training data, bias testing, and how human oversight was built in. The notified body would include this in the next periodic review.
Why this matters
The EU has deliberately ensured alignment between MDR and AI Act. For high-risk medical AI, conformity assessment takes place via the existing MDR route, but with additional AI Act elements. This prevents duplicate work but does require expansion of the file.
How do we combine MDR and AI Act compliance?
The team made a mapping: which MDR requirements already covered AI Act obligations, and where were the gaps? They went through a checklist of both regulations, point by point.
The insight
There was more overlap than expected. Risk management, clinical evaluation, and post-market surveillance were largely covered. The gaps were in: specific data governance requirements (training data documentation), transparency to the operator (more than just the IFU), and systematic bias testing.
Why this matters
For startups and small medtech companies, the overlap is good news. You don't need to run two parallel compliance tracks. But you need to understand where the AI Act goes further than what you're already doing for MDR, and add those elements to your QMS and technical file.
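The mapping exercise described above can be sketched as a simple lookup: list each AI Act obligation, note which existing MDR process already covers it, and surface the rest as gaps. The obligation names and coverage notes below are illustrative placeholders, not official MDR or AI Act article references.

```python
# Hypothetical gap analysis: which AI Act obligations are already covered
# by existing MDR processes, and which need new work. All labels are
# illustrative, not official article references.
AI_ACT_OBLIGATIONS = {
    "risk management": "covered by the MDR risk management process",
    "clinical validation": "covered by MDR clinical evaluation",
    "post-market monitoring": "covered by MDR post-market surveillance",
    "training data governance": None,   # gap: provenance, representativeness
    "operator transparency": None,      # gap: more than just the IFU
    "bias testing": None,               # gap: systematic subgroup evaluation
    "human oversight": None,            # gap: override and confirmation steps
}

def find_gaps(obligations):
    """Return the obligations with no existing MDR coverage, sorted by name."""
    return sorted(name for name, coverage in obligations.items() if coverage is None)

print(find_gaps(AI_ACT_OBLIGATIONS))
# → ['bias testing', 'human oversight', 'operator transparency', 'training data governance']
```

A flat table like this is deliberately low-tech: the value is in forcing an explicit "covered by what?" answer for every obligation before expanding the technical file.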
Can we handle this as a small startup?
MediCheck had 25 employees. No large legal department, no dedicated compliance team. The question was real: did they have the capacity to take this on? And what if they couldn't?
The insight
The answer lay in prioritization and collaboration. The team decided to hire external expertise for the gap analysis, focus internally on data governance (where they had domain knowledge), and work with their notified body as a partner rather than adversary.
Why this matters
Many startups fear AI Act compliance will overwhelm them. But the key is: start early, integrate it into your development process, and see it as quality improvement. Companies that treat compliance as an afterthought struggle most. Companies that build it in from the start find it becomes part of good engineering.
The Journey
Step by step to compliance
The message from the notified body
The notified body that had assessed their MDR file made contact. With the AI Act coming, they needed to supplement their technical file. The question wasn't if, but how quickly.
Understanding the overlap
The AI Act and the MDR are two different regulations, but for medical AI systems they overlap. SkinScan AI, as a medical device, already fell under strict requirements; the AI Act added new dimensions.
The gap analysis
The team went through AI Act requirements point by point. Where did they already comply thanks to MDR compliance? Where were the gaps? Risk management was largely covered, but there were questions about data governance.
Examining the dataset
One of the AI Act requirements concerns the quality and representativeness of training data. The team looked critically at their dataset. Was it diverse enough? Did it represent all patient groups that would be screened in practice?
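A first-pass representativeness check like this can be sketched as a per-group share count over the training set. The Fitzpatrick skin-type labels and the 5% minimum share are assumptions chosen for illustration, not thresholds taken from the AI Act.

```python
from collections import Counter

# Hypothetical representativeness check for a dermatology training set.
# Each image carries a Fitzpatrick skin-type label (I–VI). The 5% minimum
# share is an illustrative internal target, not a regulatory figure.
MIN_SHARE = 0.05

def underrepresented(skin_types, min_share=MIN_SHARE):
    """Return the skin types whose share of the dataset falls below min_share."""
    counts = Counter(skin_types)
    total = sum(counts.values())
    all_types = ["I", "II", "III", "IV", "V", "VI"]
    return [t for t in all_types if counts.get(t, 0) / total < min_share]

# Example: a skewed dataset with no type VI images at all.
sample = ["I"] * 40 + ["II"] * 30 + ["III"] * 20 + ["IV"] * 8 + ["V"] * 2
print(underrepresented(sample))  # → ['V', 'VI']
```

Counting shares per group is only a starting point, but it makes gaps visible that an aggregate accuracy figure hides, and it produces exactly the kind of data documentation the expanded technical file asks for.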
The bias discussion
The analysis revealed an uncomfortable truth: the training data predominantly contained images of lighter skin types. Would the algorithm perform equally well for people with darker skin? This wasn't just a compliance question: it was an ethical one.
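Systematic bias testing of this kind comes down to comparing performance per subgroup rather than in aggregate. A minimal sketch, using hypothetical evaluation rows and sensitivity (true-positive rate) as the metric:

```python
def sensitivity_by_group(records):
    """Per-group sensitivity (true-positive rate) from (group, truth, prediction) rows."""
    stats = {}
    for group, truth, pred in records:
        if truth:  # only confirmed positive cases count toward sensitivity
            tp, pos = stats.get(group, (0, 0))
            stats[group] = (tp + (1 if pred else 0), pos + 1)
    return {g: tp / pos for g, (tp, pos) in stats.items()}

# Illustrative evaluation rows: (skin-type group, has_melanoma, model_flagged).
rows = [
    ("light", True, True), ("light", True, True),
    ("light", True, True), ("light", True, False),
    ("dark", True, True), ("dark", True, False),
    ("dark", True, False), ("dark", True, False),
]
print(sensitivity_by_group(rows))
# → {'light': 0.75, 'dark': 0.25} — a disparity that would trigger review
```

In practice a full subgroup evaluation would also cover specificity and confidence intervals, but even this bare comparison shows how a model can look acceptable overall while underperforming for a specific patient group.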
Expanding the dataset
MediCheck decided to expand the dataset. They worked with clinics in various countries to collect images that better reflected the diversity of the population. The model had to be retrained and validated.
The Obstacles
What went wrong?
Challenge
The training data wasn't diverse enough
Solution
Collaboration with international clinics to build a more representative dataset
Challenge
GPs interpreted AI output as a definitive diagnosis
Solution
UI adjustments with clear disclaimers and mandatory confirmation steps
Challenge
The notified body had limited experience with AI Act requirements
Solution
Proactive knowledge sharing and a joint learning process
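The mandatory confirmation step from the second challenge can be sketched as a gate in which the AI result stays advisory until a clinician explicitly confirms or overrides it. All names and fields here are hypothetical, not MediCheck's actual interface.

```python
from dataclasses import dataclass

@dataclass
class AIFinding:
    """A hypothetical AI result that is advisory until a clinician decides."""
    lesion_id: str
    suspicion_score: float      # model output, 0.0–1.0
    status: str = "advisory"    # advisory -> confirmed / overridden

def record_decision(finding, clinician_agrees, rationale=""):
    """The finding only becomes actionable after an explicit clinician decision."""
    if clinician_agrees:
        finding.status = "confirmed"
    else:
        if not rationale:
            raise ValueError("an override requires a documented rationale")
        finding.status = "overridden"
    return finding

f = AIFinding("lesion-001", 0.82)
record_decision(f, clinician_agrees=False, rationale="clinical picture inconsistent")
print(f.status)  # → overridden
```

The design choice worth noting is that an override demands a documented rationale: this both keeps the doctor in control (the AI Act's human-oversight requirement) and produces an audit trail for post-market surveillance.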
The AI Act forced us to look critically at our data. Our system became better because of it, for all patients.
The Lessons
What can we learn from this?
Medical AI has double compliance obligations
If you develop AI for healthcare, you need to account for both the MDR and the AI Act. This can be complex, but the two regulations reinforce each other.
Data diversity is not optional
Training data must be representative of all patient groups your system will encounter. This is not a luxury; it's a prerequisite for responsible AI.
User communication is crucial
How you present AI to users influences how they interact with it. Be explicit about what the system can and cannot do.
Compliance can improve innovation
The extra scrutiny the AI Act brought led to a better product. Stricter requirements strengthen trust, with regulators and patients alike.
Are you building AI for healthcare?
Discover which AI Act requirements apply specifically to your sector.