Responsible AI Platform
πŸ₯Healthcare

The Story of MediCheck

About a medtech startup that discovered CE marking alone was no longer enough

Fictional scenario β€” based on realistic situations

01

The Trigger

How it started

πŸ“§

MediCheck had just received CE marking for their SkinScan AI under the Medical Device Regulation. The team was celebrating. Finally, they could go to market. But the joy was short-lived.

An additional layer of legislation. Jumping through hoops again. And the question: could they afford to wait? GPs were counting on them. So were patients.

β€œThe call from their notified body was unexpected: 'Your AI now also falls under the AI Act. We need to reconsider your file.'”
02

The Questions

What did they need to find out?

Question 1

What changes for medical AI under the AI Act?

The team dove into the legal text. The AI Act automatically classifies medical devices with AI components as high-risk AI. But what did that mean concretely? The MDR already set requirements for clinical validation, risk management, and post-market surveillance. Where was the added value?

πŸ’‘ The insight

The AI Act adds specific requirements that the MDR doesn't cover: requirements for training data (representativeness, bias detection), transparency to operators (how does the user understand what the AI does?), and human oversight (can the doctor override the AI?). They are complementary regulations, not replacements.

🌍 Why this matters

Many medtech companies think their MDR compliance is sufficient. But the AI Act asks for a different lens: not just "does the system work?" but also "is it fair, understandable, and controllable?" These dimensions are receiving increasing attention from notified bodies.

Question 2

Do we need to redo our conformity assessment?

This was the question that made everyone nervous. The CE marking was brand new, and it had cost months of work and significant money. Did everything need to be redone? The team contacted their notified body for clarity.

πŸ’‘ The insight

The answer was nuanced. A full reassessment wasn't necessary, but the technical file needed to be supplemented. Specifically: documentation on training data, bias testing, and how human oversight was built in. The notified body would include this in the next periodic review.

🌍 Why this matters

The EU has deliberately ensured alignment between MDR and AI Act. For high-risk medical AI, conformity assessment takes place via the existing MDR route, but with additional AI Act elements. This prevents duplicate work but does require expansion of the file.

Question 3

How do we combine MDR and AI Act compliance?

The team made a mapping: which MDR requirements already covered AI Act obligations, and where were the gaps? They went through a checklist of both regulations, point by point.

πŸ’‘ The insight

There was more overlap than expected. Risk management, clinical evaluation, and post-market surveillance were largely covered. The gaps were in: specific data governance requirements (training data documentation), transparency to the operator (more than just the IFU), and systematic bias testing.

🌍 Why this matters

For startups and small medtech companies, the overlap is good news. You don't need to run two parallel compliance tracks. But you need to understand where the AI Act goes further than what you're already doing for MDR, and add those elements to your QMS and technical file.

Question 4

Can we handle this as a small startup?

MediCheck had 25 employees. No large legal department, no dedicated compliance team. The question was real: did they have the capacity to take this on? And what if they couldn't?

πŸ’‘ The insight

The answer lay in prioritization and collaboration. The team decided to hire external expertise for the gap analysis, to focus internally on data governance (where they had domain knowledge), and to work with their notified body as a partner rather than an adversary.

🌍 Why this matters

Many startups fear AI Act compliance will overwhelm them. But the key is: start early, integrate it into your development process, and see it as quality improvement. Companies that treat compliance as an afterthought struggle most. Companies that build it in from the start find it becomes part of good engineering.

03

The Journey

Step by step to compliance

Step 1 of 6
πŸ“¬

The message from the notified body

The notified body that had assessed their MDR file got in touch. With the AI Act coming into force, MediCheck needed to supplement its technical file. The question wasn't if, but how quickly.

Step 2 of 6
πŸ”„

Understanding the overlap

The AI Act and the MDR are two different regulations, but for medical AI systems they overlap. SkinScan AI, as a medical device, already fell under strict requirements β€” the AI Act added new dimensions.

Step 3 of 6
πŸ”

The gap analysis

The team went through AI Act requirements point by point. Where did they already comply thanks to MDR compliance? Where were the gaps? Risk management was largely covered, but there were questions about data governance.

Step 4 of 6
πŸ“Š

Examining the dataset

One of the AI Act requirements concerns the quality and representativeness of training data. The team looked critically at their dataset. Was it diverse enough? Did it represent all patient groups that would be screened in practice?
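A representativeness check like this can be sketched in a few lines. Everything below is illustrative: the Fitzpatrick skin-type labels, the sample metadata, and the 5% threshold are assumptions for the example, not MediCheck's actual pipeline or an AI Act-mandated metric.

```python
from collections import Counter

# Hypothetical per-image metadata: Fitzpatrick skin type (I-VI)
# for each image in the training set.
labels = ["I", "II", "II", "III", "II", "I", "III", "IV", "II", "I"]

# Assumed minimum share per group; a real threshold would be derived
# from the intended-use population, not picked arbitrarily.
MIN_SHARE = 0.05

counts = Counter(labels)
total = sum(counts.values())

for skin_type in ["I", "II", "III", "IV", "V", "VI"]:
    share = counts.get(skin_type, 0) / total
    flag = "UNDER-REPRESENTED" if share < MIN_SHARE else "ok"
    print(f"Type {skin_type}: {share:.0%} {flag}")
```

In this toy sample, types V and VI are absent entirely, so they would be flagged immediately β€” exactly the kind of gap MediCheck's team found.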

Step 5 of 6
βš–οΈ

The bias discussion

The analysis revealed an uncomfortable truth: the training data predominantly contained images of lighter skin types. Would the algorithm perform equally well for people with darker skin? This wasn't just a compliance question β€” it was an ethical one.
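The core of such a bias test is comparing performance per subgroup rather than in aggregate. The sketch below shows the idea with invented evaluation data and an assumed acceptable gap; real bias testing would use proper evaluation sets, confidence intervals, and clinically justified thresholds.

```python
def sensitivity(y_true, y_pred):
    """Fraction of actual positives (malignant lesions) the model flags."""
    positives = [(t, p) for t, p in zip(y_true, y_pred) if t == 1]
    if not positives:
        return float("nan")
    return sum(p for _, p in positives) / len(positives)

# Hypothetical evaluation results, grouped by skin-type category.
# Each tuple is (ground truth, model prediction), 1 = malignant.
groups = {
    "lighter": ([1, 1, 1, 1, 0, 0], [1, 1, 1, 1, 0, 0]),
    "darker":  ([1, 1, 1, 1, 0, 0], [1, 1, 0, 0, 0, 0]),
}

MAX_GAP = 0.10  # assumed acceptable sensitivity gap between groups

scores = {name: sensitivity(t, p) for name, (t, p) in groups.items()}
gap = max(scores.values()) - min(scores.values())
print(scores, "gap:", round(gap, 2))
if gap > MAX_GAP:
    print("Bias flag: consider dataset expansion and retraining")
```

A gap this large in aggregate metrics can stay hidden: averaged over both groups, the model still looks reasonable, which is why the per-subgroup breakdown matters.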

Step 6 of 6
🌍

Expanding the dataset

MediCheck decided to expand the dataset. They worked with clinics in various countries to collect images that better reflected the diversity of the population. The model had to be retrained and validated.

04

The Obstacles

What went wrong?

Obstacle 1

βœ— Challenge

The training data wasn't diverse enough

↓

βœ“ Solution

Collaboration with international clinics to build a more representative dataset

Obstacle 2

βœ— Challenge

GPs interpreted AI output as definitive diagnosis

↓

βœ“ Solution

UI adjustments with clear disclaimers and mandatory confirmation steps

Obstacle 3

βœ— Challenge

The notified body had limited experience with AI Act requirements

↓

βœ“ Solution

Proactive knowledge sharing and joint learning process

β€œThe AI Act forced us to look critically at our data. Our system became better because of it β€” for all patients.”
β€” Dr. Sarah Okonkwo, CTO, MediCheck
05

The Lessons

What can we learn from this?

Lesson 1 / 4
βš•οΈ

Medical AI has double compliance obligations

If you develop AI for healthcare, you need to account for both MDR and the AI Act. This can be complex, but the regulations reinforce each other.

Lesson 2 / 4
🌈

Data diversity is not optional

Training data must be representative of all patient groups your system will encounter. This is not a luxury β€” it's a prerequisite for responsible AI.

Lesson 3 / 4
πŸ’¬

User communication is crucial

How you present AI to users influences how they interact with it. Be explicit about what the system can and cannot do.

Lesson 4 / 4
πŸš€

Compliance can improve innovation

The extra scrutiny the AI Act brought led to a better product. Stricter requirements strengthen trust β€” with regulators and patients alike.

Are you building AI for healthcare?

Discover which AI Act requirements apply specifically to your sector.