
The Story of EduAssist

How an EdTech scale-up discovered that AI study advice is strictly regulated

Fictional scenario — based on realistic situations

01

The Trigger

How it started

📧

EduAssist had developed StudyPath AI to help students with study choices. Universities were enthusiastic. So were students. But when the AI Act arrived, education turned out to be one of its high-risk sectors.

Education is explicitly named in Annex III of the AI Act. AI that influences access to education or assesses students is high-risk. And StudyPath AI did exactly that: it influenced which courses students chose and thereby their future career paths.

"Our AI helps students choose the right courses. How is that high-risk?" The CEO didn't understand — until she actually read the AI Act.

02

The Questions

What did they need to find out?

Question 1

Why is education AI high-risk?

The team dove into the AI Act. Annex III explicitly mentions AI systems "intended to be used for the purpose of determining access or admission to educational institutions" and systems that "evaluate or monitor the learning outcomes of persons".

💡 The insight

The rationale is clear: AI in education can be life-defining. Wrong study advice can shape a career — or misshape it. The AI Act recognizes this and imposes strict requirements for transparency, explainability, and students' right to request human intervention.

🌍 Why this matters

Educational institutions increasingly rely on AI for study advice, assessment, and student monitoring. But few realize this falls under the AI Act. The first enforcement actions in this sector are widely expected.

Question 2

What does this mean for study advice?

StudyPath AI didn't issue binding decisions; its outputs were recommendations. But the question was: how were those recommendations treated in practice? The team interviewed users.

💡 The insight

Students and study advisors often treated AI recommendations as factual wisdom. "The AI says I should take this course" — as if it were objective truth. The system's impact was greater than the designers had anticipated.

🌍 Why this matters

AI recommendations carry an authority that goes beyond their technical basis. Research on automation bias shows that people often follow AI suggestions uncritically, especially when they are uncertain themselves. This is precisely why the AI Act demands human oversight and explainability.

Question 3

How do we prevent bias in our recommendations?

The team analyzed their training data and output. Were there patterns that disadvantaged certain groups? The results were sobering.

💡 The insight

The system turned out to recommend technical courses more often to male students, and communication-oriented courses to female students. This wasn't explicitly programmed, but a reflection of historical choice patterns in the training data. It was a classic example of bias reproduction.

🌍 Why this matters

AI systems learn from data, and data reflects society — including its prejudices. This is not a theoretical problem: researchers have repeatedly shown that AI reinforces gender stereotypes and other biases. The AI Act requires active bias detection and mitigation.
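
What does such an analysis look like in practice? Below is a minimal sketch in Python, not EduAssist's actual code: it measures, per group, how often a given track is recommended and reports the gap between groups (a demographic-parity check). The log format and the numbers are invented for illustration.

```python
from collections import Counter

# Hypothetical recommendation log: (student_gender, recommended_track).
recommendations = [
    ("male", "engineering"), ("male", "engineering"), ("male", "communication"),
    ("female", "communication"), ("female", "communication"), ("female", "engineering"),
]

def track_rates(recs, track):
    """Share of each group's recommendations that point to `track`."""
    totals, hits = Counter(), Counter()
    for gender, rec in recs:
        totals[gender] += 1
        if rec == track:
            hits[gender] += 1
    return {g: hits[g] / totals[g] for g in totals}

rates = track_rates(recommendations, "engineering")
# Demographic-parity gap: difference between the highest and lowest group rate.
gap = max(rates.values()) - min(rates.values())
print(rates, f"gap = {gap:.2f}")
```

A gap near zero is no proof of fairness, but a large gap is exactly the kind of signal that flagged StudyPath AI's technical-course recommendations for review.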

Question 4

What transparency must we provide?

The AI Act requires users to know they are dealing with AI and to understand how the system works. But how do you explain a complex recommendation system to an 18-year-old student?

💡 The insight

The solution lay in layered transparency. A simple explanation for students: "This advice is based on your previous results, your stated interests, and career trends." A more detailed explanation for study advisors. And complete technical documentation for audits.

🌍 Why this matters

Transparency is not one-size-fits-all. Different stakeholders have different information needs. The art is to give each group the information it needs to use or evaluate the system correctly.
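
As an illustration of how layered transparency can be organized in software: one explanation object that carries a layer per audience, and a lookup that serves each role only its own layer. This is a minimal sketch; all field names and values are invented, not StudyPath AI's actual design.

```python
from dataclasses import dataclass, field

@dataclass
class LayeredExplanation:
    """One recommendation, explained at three levels of detail."""
    student_summary: str   # plain language, for the student
    advisor_detail: str    # richer reasoning, for study advisors
    audit_record: dict = field(default_factory=dict)  # full trace, for audits

explanation = LayeredExplanation(
    student_summary=(
        "This advice is based on your previous results, your stated "
        "interests, and career trends."
    ),
    advisor_detail=(
        "Strongest factors: grade average in mathematics, stated interest "
        "in software, regional labour-market trends."
    ),
    audit_record={"model_version": "2024-03", "feature_log_id": "abc123"},
)

def explain(expl: LayeredExplanation, role: str):
    """Serve each audience only the layer it needs."""
    return {
        "student": expl.student_summary,
        "advisor": expl.advisor_detail,
        "auditor": expl.audit_record,
    }[role]

print(explain(explanation, "student"))
```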

03

The Journey

Step by step to compliance

Step 1 of 6
📜

The regulatory analysis

A consultant pointed out that education AI falls under Annex III. Management requested a full analysis of the implications.

Step 2 of 6
🎙️

User research

The team interviewed students and study advisors. How were AI recommendations treated in practice? The answers were eye-openers.

Step 3 of 6
⚖️

Bias audit

An external audit revealed gender bias. The model reproduced historical stereotypes in study choices.

Step 4 of 6
🔧

Model correction

The team adjusted the model to mitigate gender bias. This required careful consideration: how do you correct without creating new imbalances?
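
One established technique that fits this step is reweighing (Kamiran and Calders): training samples from group-outcome combinations that are rarer than statistical independence would predict get a weight above one, so the model stops learning the historical skew. The sketch below uses invented counts; the story leaves open which technique EduAssist actually chose.

```python
from collections import Counter

# Hypothetical training rows: (group, recommended_track), counts invented.
rows = (
    [("male", "engineering")] * 60 + [("male", "communication")] * 40
    + [("female", "engineering")] * 30 + [("female", "communication")] * 70
)

def reweighing_weights(rows):
    """Kamiran-Calders reweighing: w(g, t) = P(g) * P(t) / P(g, t).

    Pairs that occur less often than independence would predict (here:
    female students with an engineering recommendation) get a weight
    above 1, so the reweighted data no longer encodes the skew."""
    n = len(rows)
    group = Counter(g for g, _ in rows)
    track = Counter(t for _, t in rows)
    joint = Counter(rows)
    return {
        (g, t): (group[g] / n) * (track[t] / n) / (joint[(g, t)] / n)
        for (g, t) in joint
    }

for pair, weight in sorted(reweighing_weights(rows).items()):
    print(pair, round(weight, 2))
# ('female', 'communication') 0.79
# ('female', 'engineering') 1.5
# ('male', 'communication') 1.38
# ('male', 'engineering') 0.75
```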

Step 5 of 6
🖥️

Transparency layer

A new UX was designed that made clear the advice came from AI, with an explanation of the basis for each recommendation.

Step 6 of 6
🙋

Student rights

A process was set up for students to request human review of AI recommendations.
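
Such a process can be captured in a small data model: a request that moves from "requested" through "in review" to either "upheld" or "overridden" by a human advisor. A minimal sketch; the field names and statuses are illustrative, not a format the AI Act prescribes.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class ReviewStatus(Enum):
    REQUESTED = "requested"
    IN_REVIEW = "in_review"
    UPHELD = "upheld"          # the advisor confirms the AI recommendation
    OVERRIDDEN = "overridden"  # the advisor replaces it with human advice

@dataclass
class ReviewRequest:
    """A student's request for human review of one AI recommendation."""
    student_id: str
    recommendation_id: str
    reason: str
    status: ReviewStatus = ReviewStatus.REQUESTED
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def start_review(request: ReviewRequest) -> ReviewRequest:
    """An advisor picks up the request; routing and notification omitted."""
    request.status = ReviewStatus.IN_REVIEW
    return request

req = start_review(
    ReviewRequest("s-1024", "rec-7", "The advice ignores my part-time job")
)
print(req.status.value, req.created_at.isoformat())
```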

04

The Obstacles

What went wrong?

Obstacle 1

Challenge

The model reproduced gender stereotypes from the training data

Solution

Bias mitigation techniques and monitoring of output distribution per demographic group

Obstacle 2

Challenge

Students often followed AI advice uncritically

Solution

UX adjustments emphasizing these are suggestions, not prescriptions

Obstacle 3

Challenge

Educational institutions had no AI governance frameworks

Solution

Partnership with institutions to co-create governance guidelines

"We thought we were helping students. The AI Act forced us to acknowledge we were also creating risks. Now we're really helping them better."
Lisa de Vries, CEO, EduAssist

05

The Lessons

What can we learn from this?

Lesson 1 / 4
🎓

Education AI is high-risk

AI that influences educational access or performance falls under Annex III. This applies to many EdTech applications.

Lesson 2 / 4
💡

Recommendations are not innocent

AI suggestions are often treated as fact. Design must acknowledge and counter this.

Lesson 3 / 4
⚖️

Historical data contains historical bias

If your model learns from the past, it also learns the prejudices of the past. Active bias detection is essential.

Lesson 4 / 4
📊

Transparency must be layered

Different users have different information needs. One format doesn't fit everyone.

Are you building AI for education?

Discover how the AI Act affects EdTech and what you need to do for compliance.