A practical guide based on the guidance from the Dutch Data Protection Authority
Important deadline: The European AI Act imposes an explicit duty of care on organizations from February 2, 2025, regarding the AI literacy of their employees.
Why AI Literacy is Crucial Now
With the arrival of the European AI Act, organizations are on the verge of a new compliance challenge. Starting February 2, 2025, you are required to ensure that your employees possess a "sufficient level of AI literacy," as Article 4 of the Act puts it. This means that anyone who selects, trains, manages, or interprets the output of an AI system on behalf of your organization must have sufficient knowledge, skills, and critical awareness to do so responsibly.
To support organizations in this effort, the Dutch Data Protection Authority (AP) has published the guidance document "Getting Started with AI Literacy." This guide offers a clear roadmap to systematically cultivate a culture of AI maturity.
What Exactly is AI Literacy?
AI literacy is a broad concept that extends beyond mere technical knowledge. It encompasses a mix of competencies essential for the responsible use of artificial intelligence.
The Four Pillars of AI Literacy
An AI-literate employee understands not only the technology but also the context in which it operates. This includes:
- Technical Competencies: A basic understanding of how AI systems work.
- Ethical Competencies: The ability to assess the impact of AI on people and society.
- Legal Competencies: Knowledge of relevant laws and regulations, such as the EU AI Act and GDPR.
- Practical Competencies: Recognizing risks like bias, correctly interpreting results, and knowing when to escalate issues.
The Dutch DPA emphasizes that the required level of knowledge depends on the role, context, and risk of the AI system. A data scientist needs a more in-depth understanding than an HR advisor using an AI tool for recruitment.
The Dutch DPA's Four-Step Model for a Structured Approach
The Dutch DPA introduces a practical four-step model that functions as an iterative cycle: identify, set goals, execute, and evaluate. It is designed to help organizations build AI literacy in a structured and sustainable manner.
Identify
Map systems, risks, and competencies.
Set Goals
Establish measurable and realistic improvement targets.
Execute
Implement training, governance, and policies.
Evaluate
Measure progress, report, and adjust.
Step 1: Identify – The Foundation of Your Strategy
The first step is to create a clear overview of the current situation within your organization. Without a proper diagnosis, you cannot develop an effective strategy. This process involves several key activities.
A Thorough Inventory
Start by creating a central register of all AI systems in use. Where relevant, link this register to your existing GDPR processing register. For each system, document its purpose, users, the data it processes, and its technical specifications.
Next, analyze the risk level of each system. Use the risk categories from the EU AI Act as a guideline (e.g., unacceptable, high, limited, minimal risk) and focus on the potential impact on individuals and society.
Simultaneously, it is essential to map the current knowledge and skill levels of employees. A baseline measurement, for example, through surveys or interviews, will reveal where competencies are lacking. Finally, roles and responsibilities must be clearly defined. Who develops, who decides, and who monitors? Clear ownership and transparent escalation lines are indispensable.
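The inventory described above can be prototyped as a simple data structure before investing in tooling. The sketch below is illustrative only: the class and field names (`AISystemEntry`, `RiskLevel`, `high_risk_systems`) are our own, and the risk categories mirror the EU AI Act tiers mentioned earlier.

```python
from dataclasses import dataclass
from enum import Enum

class RiskLevel(Enum):
    # Risk categories from the EU AI Act, used as a guideline
    UNACCEPTABLE = "unacceptable"
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"

@dataclass
class AISystemEntry:
    """One row in the central AI register."""
    name: str
    purpose: str
    users: list[str]            # who works with the system
    data_processed: list[str]   # link to the GDPR processing register
    risk_level: RiskLevel
    owner: str                  # clear ownership: who decides and monitors

def high_risk_systems(register: list[AISystemEntry]) -> list[AISystemEntry]:
    """Systems to prioritize in the literacy program."""
    return [e for e in register
            if e.risk_level in (RiskLevel.UNACCEPTABLE, RiskLevel.HIGH)]
```

Even a register this minimal answers the key diagnostic questions: what runs where, on whose data, and who is accountable.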
Step 2: Set Goals – From Insight to Ambition
With the insights from the identification phase, you can formulate concrete and measurable ambitions. This provides focus and makes progress transparent.
SMART Goals and KPIs
Translate your findings into SMART goals (Specific, Measurable, Achievable, Relevant, Time-bound) for different roles and risk levels. Prioritize high-risk systems, such as AI that makes decisions about job applicants or access to public services. Describe the knowledge and skills each stakeholder needs to achieve an "adequate level" of literacy and define who is managerially responsible for achieving these goals.
To measure progress, you can establish relevant Key Performance Indicators (KPIs).
| KPI | Possible Target | Relevance to EU AI Act |
|---|---|---|
| % of employees with basic AI training | ≥ 90% | Demonstrates "sufficient AI literacy" (Article 4). |
| Number of high-risk systems with a full risk assessment | 100% | Strengthens compliance and risk management. |
| Average score on AI governance maturity model | ≥ 3 out of 5 | Measures the structural embedding of responsible AI use. |
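Tracking a KPI like the first row of the table is simple arithmetic, but encoding it keeps the target explicit and auditable. The helper names below (`training_coverage`, `kpi_met`) are illustrative, not part of any standard.

```python
def training_coverage(trained: int, total: int) -> float:
    """Percentage of employees who completed basic AI training."""
    if total == 0:
        raise ValueError("no employees to measure against")
    return 100.0 * trained / total

def kpi_met(coverage_pct: float, target_pct: float = 90.0) -> bool:
    """Check coverage against the '>= 90% trained' target from the table."""
    return coverage_pct >= target_pct
```

Reporting the raw numbers alongside the pass/fail verdict makes the KPI dashboard easy to verify during an audit.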
Step 3: Execute – From Ambition to Action
This phase is about the actual implementation of measures. The Dutch DPA's guidance suggests a pragmatic action program focusing on four areas:
- Training & Awareness: Develop a basic e-learning module for all employees and offer in-depth training or bootcamps for specialists. Use role-specific case studies to increase relevance.
- Governance & Integration: Make AI risks a fixed agenda item in management meetings and integrate the topic into existing risk committees.
- Transparency & Communication: Publish an 'AI register' on the intranet with information about the systems used, their purpose, and a contact person. A dashboard with KPIs can visualize progress.
- Culture & Vision: Develop a clear vision document on how the organization deals with AI, based on core principles such as fairness, transparency, and human oversight.
Tip: Appoint an AI Officer or assign clear ownership to a role like the Chief Data Officer. Shared responsibility often leads to no responsibility.
Step 4: Evaluate – Measure, Learn, and Adjust
AI literacy is not a one-time project but a continuous process. Technology, applications, and regulations are constantly evolving. A Plan-Do-Check-Act (PDCA) cycle is therefore essential.
The PDCA Cycle for Continuous Improvement
- Plan: Set new learning objectives based on evaluations.
- Do: Implement training and process improvements.
- Check: Audit processes, measure KPIs, and gather feedback.
- Act: Adjust targets and expand the program.
In concrete terms, this means that AI literacy must become a fixed part of periodic risk analyses and audits. Measure residual risks and propose additional measures where necessary. Repeat the baseline measurement annually to quantify knowledge growth and report the results to management to keep the topic on the agenda.
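Quantifying knowledge growth between the annual baseline measurements can be as simple as comparing average scores per role. This is a hypothetical sketch, assuming literacy is scored on a numeric scale (for example, a maturity score out of 5); the function name `knowledge_growth` is our own.

```python
def knowledge_growth(baseline: dict[str, float],
                     current: dict[str, float]) -> dict[str, float]:
    """Change in average literacy score per role between two
    annual measurements; only roles present in both are compared."""
    return {role: round(current[role] - baseline[role], 2)
            for role in baseline if role in current}
```

A per-role delta like this is exactly the kind of concrete figure that keeps the topic on the management agenda.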
Practical Tips for a Successful Start
The Dutch DPA emphasizes that perfection is not the goal; getting started is what matters most. Start small, for example, with one department or one high-risk AI system. Learn from the experience and then scale up.
Concrete First Steps
Organize an internal brainstorming session to identify all AI applications (including hidden ones). Create a simple spreadsheet to start your AI register and link AI risks to your existing risk management processes to avoid duplication of effort. By linking AI literacy to employees' personal development plans, you make it an integral part of the organizational culture.
From Compliance to Competitive Advantage
AI literacy is more than a compliance checkbox. Organizations that invest in it now are building a sustainable competitive advantage. They make better AI choices, prevent costly mistakes, and build trust with customers and stakeholders. The guidance from the Dutch DPA provides a clear roadmap. The question is not whether to start, but when.
Ready to get started? You can download the complete guidance "Getting Started with AI Literacy" from the Dutch Data Protection Authority's website and begin with step 1 today.
This article is an initiative of geletterdheid.ai. We help organizations navigate the complexity of the EU AI Act and build responsible AI practices. Do you have questions about AI literacy in your organization? Contact us.