AI Literacy 2026
From Article 4 obligation to demonstrable compliance: the complete step-by-step plan for organisations. Based on Dutch DPA guidance and Article 4 of the EU AI Act.
1. Executive summary
Since 2 February 2025, AI literacy has been a legal obligation for every organisation deploying AI. Article 4 of the EU AI Act requires organisations to demonstrably ensure that their staff, including contractors, possess sufficient knowledge and skills to work responsibly with AI.
The Dutch Data Protection Authority (AP) published guidance in March 2025 describing a concrete four-step model: identify, set goals, execute, and evaluate. This is not a one-off project but an iterative process.
This handbook translates the legal obligation into a workable plan. You will find: a role matrix with competence profiles, a 12-month implementation plan, measurable KPIs for compliance reporting, and a complete audit-ready checklist. It is intended for:
- Compliance officers and legal advisors responsible for Article 4 compliance
- HR and learning managers tasked with setting up training programmes
- Board members and executives ultimately accountable for AI governance
- IT managers who manage and procure AI systems
2. The legal context
The EU AI Act (Regulation 2024/1689) introduces AI literacy as one of the first obligations to take effect. It is not a side note: it is the foundation on which all other AI Act obligations rest.
Article 4: key points
- Applies to both providers and deployers of AI systems.
- Requires measures to ensure, "to their best extent", a sufficient level of AI literacy among staff and other persons operating or using AI systems on their behalf.
- The required level is proportionate: it depends on technical knowledge, experience, education and training, and on the context in which the AI systems are used.
- The persons or groups on whom the AI systems are to be used must also be taken into account.
Enforcement timeline
The AI literacy obligation has applied since 2 February 2025; the AI Act's governance and penalty provisions apply from 2 August 2025, from which point national supervisors can enforce. Non-compliance with Article 4 falls under the AI Act penalty framework: fines can reach up to €15 million or 3% of annual global turnover.
3. What the DPA expects
The Dutch DPA has not only named the obligation but also provided a practical model. The guidance document "Getting started with AI literacy" (March 2025) describes a cycle of four steps that organisations must follow iteratively.
1. Identify: Map which AI systems are in use, by whom, and for what purpose, including shadow AI and procured tools with AI components. Without this inventory, targeted training is impossible.
2. Set goals: Determine per role group what level of AI literacy is required. An end user needs different knowledge than a compliance officer or an executive. Document these levels in competence profiles.
3. Execute: Develop and deliver training that matches the established goals. Combine methods: e-learning for foundational knowledge, workshops for depth, practical exercises with your own AI tools. Document participation and results.
4. Evaluate: Measure whether goals have been achieved. Use assessments, practical tests and feedback rounds. Adjust the programme based on results and repeat the cycle at least annually.
Source: AP guidance "Getting started with AI literacy", March 2025.
Key point: the DPA expects not a one-time certificate but a continuous, documented improvement process. Organisations that demonstrably implement this stand substantially stronger during an inspection.
4. Role matrix: who needs to know what?
The proportionality principle of Article 4 means not everyone needs the same training. The matrix below provides a starting point for establishing competence profiles per role group.
Board and executives
- AI Act essentials and liability
- Risk appetite and governance structure
- Fine exposure and reputational risk
- Decision-making on AI investments

Compliance and legal
- In-depth article knowledge (Art. 4, 9, 14, 26, 50)
- FRIA and DPIA execution
- Incident reporting and escalation
- Interaction with supervisory authorities

HR and learning managers
- Establishing competence profiles
- Training plan and administration
- Certificate management and progress monitoring
- Shadow AI policy and onboarding

IT and data teams
- AI system inventory and classification
- Data quality and bias monitoring
- Model governance and version management
- Vendor assurance and SLA monitoring

End users
- Correct use of AI tools
- Output verification and bias recognition
- Generative AI do's and don'ts
- When and how to escalate

Procurement
- Provider vs deployer roles
- Art. 50 transparency in contracts
- Audit rights and vendor declarations
- Identifying AI components in SaaS
5. The 12-month implementation plan
This plan distributes the implementation across four quarters. Each quarter builds on the previous, from inventory to demonstrable certification.
Q1: Inventory and baseline
- AI register of all systems in use
- Role matrix: who uses what, with what impact
- Baseline AI literacy measurement per team
- Gap analysis: current vs required level per role
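The Q1 gap analysis reduces to comparing the baseline measurement against the required level per role group. A sketch, assuming literacy levels are scored on a simple ordinal scale (the scale and the example numbers are assumptions, not DPA requirements):

```python
# Required level per role group (from the competence profiles)
required = {"board": 2, "compliance": 4, "hr": 3, "it": 4, "end_user": 2}

# Baseline measurement per role group (e.g. average assessment score)
measured = {"board": 1, "compliance": 3, "hr": 3, "it": 4, "end_user": 1}

# Gap analysis: positive values mean training is needed
gaps = {role: required[role] - measured.get(role, 0) for role in required}

# Role groups with a shortfall, largest gap first: the Q3 training priority
priority = sorted((r for r, g in gaps.items() if g > 0),
                  key=lambda r: gaps[r], reverse=True)
```

The output of this step is exactly the input Q2 needs: per role group, how far current competence falls short of the profile.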
Q2: Goals and policy
- AI literacy plan adopted by board
- Competence profiles per role group
- Policy on shadow AI, procurement and generative AI
- Ownership and governance: who monitors this plan?
Q3: Roll out training at scale
- Role-targeted training modules rolled out
- Sector and function-specific case studies
- Progress monitoring via dashboard or registration
- Interim assessments per employee
Q4: Evaluate and embed
- AI literacy re-measured (minimum 85% of the target group at the required level)
- Complete audit trail: who, what, when, result
- Annual evaluation report for board
- Embed in HR onboarding and annual refresher
When executing Q3 and Q4, a structured training platform can save considerable time. Modules, assessments and certification are ready to go, including a progress dashboard and exportable reports. The AI Academy platform is specifically designed for this purpose and aligns with the DPA's four-step model.
6. Measuring and documenting
Compliance requires not just training, but demonstrating its effectiveness. The DPA expects a verifiable evidence trail.
Quantitative indicators
- Completion rate per role and department
- Assessment scores and improvement trends
- Time-to-competency for new employees
- Number of AI incidents reported (an increase may indicate better awareness rather than worse behaviour)
- Reduction in AI-related policy violations
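Indicators like completion rate per role are straightforward to compute from the participation records. A minimal sketch; the record layout is an assumption for illustration:

```python
from collections import defaultdict

# Each record: (employee, role, completed)
records = [
    ("alice", "compliance", True),
    ("bob", "compliance", False),
    ("carol", "end_user", True),
    ("dave", "end_user", True),
]

totals = defaultdict(int)  # employees per role
done = defaultdict(int)    # completions per role
for _, role, completed in records:
    totals[role] += 1
    if completed:
        done[role] += 1

# Completion rate per role, the headline KPI for the dashboard
completion_rate = {role: done[role] / totals[role] for role in totals}
```

Computed per department as well as per role, the same loop yields the breakdowns a supervisory authority would expect to see.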
Documentation for supervisory authorities
- Training programme design document with learning objectives
- Participation records with dates and completion status
- Assessment and exam results per employee
- Certificate records with timestamps
- Update log: how the programme adapted to new developments
- Evidence of board involvement and resource allocation
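The evidence trail boils down to timestamped records of who completed what training, when, and with what result. A sketch of one such record serialized for export; the JSON shape is an assumption, since supervisors do not prescribe a format:

```python
import json
from datetime import datetime, timezone

def training_record(employee_id: str, module: str,
                    score: float, passed: bool) -> str:
    """Serialize one audit-trail entry: who, what, when, result."""
    entry = {
        "employee_id": employee_id,
        "module": module,
        "completed_at": datetime.now(timezone.utc).isoformat(),
        "score": score,
        "passed": passed,
    }
    return json.dumps(entry)

# Append each entry to an export file; never overwrite earlier records,
# so the trail stays verifiable during an inspection.
line = training_record("emp-042", "AI Act essentials", 0.86, True)
```

Keeping the records append-only and timestamped is what turns "we did a workshop" into the verifiable evidence the DPA expects.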
7. Common mistakes
Based on DPA guidance and practical experience, these are the five most common pitfalls:
1. Treating it as a one-off. A single workshop in 2025 is not structural embedding. The DPA expects an iterative process with annual evaluation and adjustment.
2. One-size-fits-all training. The proportionality principle exists for a reason. A developer needs different knowledge than a receptionist. Uniform training wastes resources and fails to build real competence.
3. Training only technical roles. AI literacy applies to everyone who works with AI, including management, HR, legal, procurement and customer-facing roles. Excluding non-technical roles creates blind spots.
4. Skipping documentation. Without records of who was trained, when, and with what result, evidence is missing during an inspection. "We did a workshop" without documentation is insufficient.
5. Forgetting external staff. Freelancers, temp workers and external consultants who use your AI systems fall under your responsibility. Include them in the training programme.
8. Audit-ready checklist
Use this checklist to assess whether your organisation is ready for an Article 4 compliance inspection.
Inventory
- All AI systems inventoried (including SaaS with AI features)
- Per system documented: purpose, users, data, impact on affected people
- Shadow AI policy established and communicated
Competence profiles
- Required knowledge level established per role group
- Competence profiles approved by management
- Profiles linked to job descriptions
Training
- Training programme designed per role group
- Multiple learning methods deployed (e-learning, workshop, practical)
- All employees working with AI have participated
- External staff (freelancers, consultants) are included
Assessment and certification
- Assessments conducted per employee
- Results recorded with timestamps
- Certificates or completion records available
- Minimum 85% of the target group has achieved the required level
Governance and embedding
- AI literacy plan adopted by board
- Programme owner appointed
- Evaluation cycle established (minimum annually)
- Programme embedded in HR onboarding
- Update process established for new AI systems and regulations
9. Next step
This handbook provides the framework. The next step is execution. Start with the Q1 inventory โ without a clear picture of your AI landscape, targeted action is impossible.
For organisations looking to accelerate execution, the AI Academy platform offers a ready-made training environment aligned with the four-step model in this handbook, with role-targeted modules, sector-specific case studies, a progress dashboard and per-employee certification.
About Responsible AI Platform
Responsible AI Platform is the leading knowledge platform for the EU AI Act. We publish analyses, practical guides and tools that help organisations implement responsible AI. Our work is based on official EU legislation, DPA guidelines and practical experience.
The AI Academy platform was developed as a practical execution environment for AI literacy under Article 4. It offers interactive modules, assessments and certification, designed against the criteria of the Dutch Data Protection Authority.
Sources
- Regulation (EU) 2024/1689 (EU AI Act), European Parliament & Council
- "Getting started with AI literacy" (guidance), Dutch Data Protection Authority
- AI Impact Barometer / RAN-6 report, Dutch Data Protection Authority
- AI Act: regulatory framework for AI, European Commission
- Public consultation on the AI Regulation Implementation Act, Digital Government NL