Responsible AI Platform
High-risk Sector

AI Act Compliance for Education

Admission, assessment and student tracking — high-risk under the AI Act

Practical guidelines for schools, universities and other educational institutions to comply with the EU AI Act.

View the compliance checklist

Why Take Action Now?

The AI Act has a major impact on education

August 2025

GPAI and governance obligations take effect; requirements for high-risk AI systems follow in August 2026

Education AI = High-risk

AI for admission and assessment automatically falls under the strictest rules

Impact on the Future

Education AI influences future opportunities for students

Parent Rights

Students and parents have the right to transparency about AI use

High-risk AI in Education

These AI applications fall under strict AI Act requirements (Annex III)

Admission & Selection

AI systems selecting students for programs or determining who is admitted.

Selection procedures · Decentralized selection · Admission advice · Matching algorithms

Assessment & Exams

Systems grading essays, reviewing exams or determining grades.

Automated essay scoring · Exam grading software · Plagiarism detection · Peer review matching

Student Tracking Systems

AI that tracks student performance, predicts dropout or recommends interventions.

Early warning systems · Progress monitoring · Dropout prediction · Learning analytics

Personalized Learning

Adaptive learning systems adjusting content to individual students.

Adaptive learning platforms · Level assessments · Learning style analysis · Remedial teaching AI

Specific Challenges for Educational Institutions

The AI Act brings unique compliance questions for the education sector

Age & Vulnerability

Minors are especially vulnerable, and decisions about children carry extra weight under the AI Act.

Teacher in the Loop

AI may support teachers, but not replace them. How do you maintain pedagogical autonomy?

Transparency to Parents

Parents and students must understand how AI influences their educational path.

Massive Shadow AI Usage

Teachers use ChatGPT for grading and students use it for assignments: invisible and uncontrolled.

Limited Budgets

Educational institutions often have limited resources for compliance infrastructure.

AI Literacy Dual Role

Institutions must teach AI literacy (Art. 4) AND comply themselves as AI users.

AI Act Compliance Roadmap

Practical steps for educational institutions

1

EdTech Inventory

2-3 weeks

Map all AI systems in education and administration, including shadow AI.

2

Impact on Students

2-4 weeks

Determine, for each system, the impact on student opportunities and rights.

3

Vendor Assessment

3-6 weeks

Assess EdTech vendors on AI Act compliance and transparency.

4

Pedagogical Safeguards

2-4 months

Implement protocols for teacher involvement and human oversight.

5

Monitoring & Evaluation

Ongoing

Set up ongoing monitoring for impact on learning outcomes and equality.
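Step 1 of the roadmap above, the EdTech inventory, can be captured in a simple register. The sketch below is illustrative only: the fields, system names and vendor names are assumptions, not part of any official template. The key design choice is recording shadow AI explicitly, since unsanctioned tools are usually the blind spot.

```python
from dataclasses import dataclass, field

@dataclass
class AISystem:
    """One entry in the institution's AI system register (illustrative fields)."""
    name: str
    vendor: str
    purpose: str                                  # e.g. "essay scoring", "dropout prediction"
    used_by: list = field(default_factory=list)   # departments or faculties using it
    shadow_ai: bool = False                       # unsanctioned use, e.g. individual ChatGPT accounts
    risk_class: str = "unclassified"              # filled in during the classification phase

# Hypothetical entries to show the shape of the register
register = [
    AISystem("EssayGrader", "ExampleVendor", "essay scoring", ["Humanities"]),
    AISystem("ChatGPT (individual use)", "OpenAI", "grading support", ["various"], shadow_ai=True),
]

# Shadow AI deserves its own view, so it cannot hide in the full list
shadow = [s.name for s in register if s.shadow_ai]
```

A register like this also feeds phase 2 (classification): each entry's `risk_class` gets filled in as the decision tree is applied per system.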

15-month trajectory

Implementation Roadmap

Detailed 6-phase timeline with concrete deliverables

1

Inventory

Month 1-2
Complete AI system register · Shadow AI mapped · EdTech vendor list
2

Classification

Month 2-3
High-risk vs. limited/minimal per system · Proctoring classification · Justification per system
3

Gap Analysis

Month 3-5
Gap between current state and AI Act requirements per high-risk system · GDPR/AI Act overlap analysis
4

Governance Framework

Month 5-7
AI governance structure · Roles & responsibilities · Policies & procedures · AI literacy curriculum
5

Implementation

Month 7-12
Technical adjustments · Documentation · Conduct FRIAs · Staff AI literacy training
6

Audit-ready

Month 12-15
Internal audit · Dry-run for Inspectorate/DPA · Continuous monitoring operational

AI System Inventory Guide

Typical AI systems in education and their likely classification

Important: Many systems are NOT high-risk if they are purely supportive (the teacher decides). Avoid unnecessary compliance costs by classifying carefully.

Admission & Selection

Usually high-risk
Selection algorithms · Matching algorithms · Admission scoring · Intake advice

Annex III — automatically high-risk for admission and access to education

Assessment & Exams

Often high-risk
Essay scoring · Plagiarism detection · Grading software · Automated assessment

High-risk if it determines grades, progression or graduation

Student Tracking Systems

Context-dependent
Early warning systems · Dropout prediction · Learning analytics · Progress monitoring

High-risk if it influences progression decisions; otherwise limited risk

Personalized Learning

Context-dependent
Adaptive platforms · Level assessments · Remedial teaching AI · Learning style analysis

Classification depends on impact on student progression and decisions

Proctoring & Surveillance

Likely high-risk
Online proctoring · Behavior monitoring · Browser lockdown · Webcam surveillance

Highly controversial — emotion recognition features may be banned (Art. 5)

Administration & Planning

Usually minimal risk
Scheduling · Room allocation · Capacity planning · Document processing

Minimal risk unless it makes decisions affecting individuals

Classification Decision Tree

Quickly determine the risk classification of your education AI system

Does the system determine admission, enrollment or progression of students?

Yes

Automatically high-risk (Annex III)

No

Go to next question

Does it grade or assess students with impact on their academic path?

Yes

Likely high-risk

No

Go to next question

Does it use biometric data or proctoring?

Yes

High-risk (emotion recognition may be banned)

No

Go to next question

Is it a learning tool without impact on grades or progression?

Yes

Limited/minimal risk

No

Consult an expert for classification

This is a simplified decision tree. Consult your legal team for the definitive classification.
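The decision tree above is mechanical enough to sketch in code, for instance as a first-pass filter over an inventory. This is a simplified sketch of the same four questions, not a legal classification tool; the parameter names and return strings are our own.

```python
def classify_education_ai(
    determines_admission_or_progression: bool,
    grades_with_academic_impact: bool,
    uses_biometrics_or_proctoring: bool,
    learning_tool_without_grade_impact: bool,
) -> str:
    """Walk the simplified decision tree; result is indicative, not a legal opinion."""
    if determines_admission_or_progression:
        return "high-risk (Annex III)"
    if grades_with_academic_impact:
        return "likely high-risk"
    if uses_biometrics_or_proctoring:
        return "high-risk (check Art. 5 ban on emotion recognition)"
    if learning_tool_without_grade_impact:
        return "limited/minimal risk"
    return "consult an expert"
```

As with the tree itself, any outcome other than an obvious minimal-risk tool should be confirmed with your legal team.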

Governance Structure

Recommended organizational structure for AI governance in educational institutions

Board of Directors / Executive Board
AI in Education Committee (Board, IT, Educational Quality, Privacy Officer)
Educational AI Coordinator
ICT & EdTech Team
Data Protection Officer
Examination Board

Educational institutions typically have scattered AI policies across faculties. Centralize governance but leave implementation to the teams.

Key Roles

Educational AI Coordinator

Drives responsible AI use in education and alignment with teachers

AI Compliance Officer

Overall monitoring of AI Act compliance across the institution

Data Protection Officer

Ensures student privacy — crucial for minors (GDPR Art. 8)

Human Oversight Officer

Oversight for high-risk systems — required by Art. 14 AI Act

Compliance Checklist for Education AI

Concrete checkpoints for each high-risk AI system in education

AI system registered in EU database (Art. 49)
Risk management system established (Art. 9)
Data governance & student data quality ensured (Art. 10)
Technical documentation complete (Art. 11)
Logging & traceability in place (Art. 12)
Transparency to students, pupils and parents (Art. 13)
Human oversight by teacher/examiner established (Art. 14)
Accuracy & robustness tested (bias on age/background) (Art. 15)
FRIA conducted as deployer (Art. 27)
AI literacy training for staff organized (Art. 4)
Conformity assessment completed (Art. 43)

This checklist applies per high-risk system. Consult your legal team and the Education Inspectorate for sector-specific requirements.
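Since the checklist applies per high-risk system, it helps to track it as data rather than a document. The sketch below mirrors the article references from the checklist above; the tracking logic itself is an illustrative assumption, not a prescribed method.

```python
# Checklist items keyed by AI Act article, taken from the list above
CHECKLIST = {
    "Art. 49": "AI system registered in EU database",
    "Art. 9":  "Risk management system established",
    "Art. 10": "Data governance & student data quality ensured",
    "Art. 11": "Technical documentation complete",
    "Art. 12": "Logging & traceability in place",
    "Art. 13": "Transparency to students, pupils and parents",
    "Art. 14": "Human oversight by teacher/examiner established",
    "Art. 15": "Accuracy & robustness tested (bias on age/background)",
    "Art. 27": "FRIA conducted as deployer",
    "Art. 4":  "AI literacy training for staff organized",
    "Art. 43": "Conformity assessment completed",
}

def open_items(completed: set) -> list:
    """Return the checklist items still open for one high-risk system."""
    return [f"{art}: {desc}" for art, desc in CHECKLIST.items() if art not in completed]
```

Running `open_items` per system in the register gives a simple compliance dashboard: an empty list means the system is audit-ready on paper.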

Common Mistakes in Education

Avoid these pitfalls in AI Act implementation

"We don't use AI"

Plagiarism detection, LMS recommendations, schedule optimization — it IS AI. Inventory thoroughly.

Ignoring shadow AI

Teachers use ChatGPT for grading and students use it for assignments. This, too, falls under your responsibility.

Not classifying proctoring

Online proctoring is likely high-risk. Emotion recognition features may even be banned.

Forgetting Art. 4 AI literacy

Educational institutions must train staff in AI literacy. This is a legal obligation.

Leaving everything to the vendor

You are responsible as deployer. EdTech vendor compliance does not relieve you of your obligations.

Not informing students/parents

Transparency about AI use is mandatory. Students and parents have a right to an explanation.

What Makes Education AI Different?

Sector-specific considerations

Formative Life Phase

Educational decisions determine the future of young people

Children as Users

Minors deserve extra protection and age-appropriate safeguards

Public Interest

Good education is a societal interest, not just an individual right

Pedagogical Relationship

The bond between teacher and student must remain central

Education and privacy regulation

Regulatory Overlap

How the AI Act connects with existing education and privacy regulation

GDPR

Overlap: minors receive extra protection (Art. 8), special category data, parental consent

Practical tip: FRIA can partially overlap with DPIA — combine where possible. Pay extra attention to age verification.

Education Laws (WVO, WHW)

Overlap: Quality requirements, exam regulations, student rights

Practical tip: Examination boards need to build AI Act knowledge for oversight of AI-based assessment.

Education Inspectorate

Overlap: Supervision of educational quality and equal opportunities

Practical tip: The Inspectorate will include AI use in quality assessments. Be proactive.

Media Law / Children's Privacy Code

Overlap: Extra protection for minors online

Practical tip: EdTech platforms with minor users must comply with stricter privacy standards.

AI Act Art. 4 (AI Literacy)

Overlap: Institutions must offer AI literacy AND comply themselves

Practical tip: Combine Art. 4 compliance with the education curriculum — kill two birds with one stone.