Responsible AI Platform

Article 9: Risk management system

Title III: High-Risk AI Systems
Applies from 2 August 2026

Official text


Source: EUR-Lex, Regulation (EU) 2024/1689 — text reproduced verbatim.

📥 Download AI Act (PDF)

🎯 What does this mean for you?

🏭 Provider
You must establish a continuous risk management system that identifies, analyses and mitigates risks throughout the entire lifecycle. This is not a one-time assessment but an ongoing process. Document everything in your technical documentation (Art. 11).
🏢 Deployer
Ensure you understand and monitor your provider's risk management system. Report deviations from expected behaviour to the provider. Maintain a log of incidents and deviations.
🏪 SME / Startup
Start small: create a risk inventory of your AI use. What data goes in? What decisions come out? What could go wrong? Use our FRIA Generator as a starting point.
🏛️ Public Sector
Combine your AI risk management with existing frameworks such as the IAMA (Human Rights and Algorithms Impact Assessment). The AI Act requirements align with what many governments are already doing.

✅ Compliance Checklist

  • Risk management system established and documented
  • Risk identification and analysis performed
  • Risk mitigation measures implemented
  • Residual risks assessed and deemed acceptable
  • Testing procedures performed before deployment
  • Continuous monitoring system established

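For tracking progress against the checklist above, a minimal machine-readable form can help. The keys and status values below are our own illustration, not an official reporting format.

```python
# Hypothetical representation of the Article 9 compliance checklist above.
# Key names and statuses are illustrative assumptions only.
checklist = {
    "risk_management_system_documented": True,
    "risk_identification_performed": True,
    "mitigation_measures_implemented": True,
    "residual_risks_acceptable": False,      # still open in this example
    "pre_deployment_testing_done": True,
    "continuous_monitoring_established": False,
}

open_items = [item for item, done in checklist.items() if not done]
print(f"{len(checklist) - len(open_items)}/{len(checklist)} items complete")
for item in open_items:
    print("TODO:", item)
```

Because Article 9 requires an ongoing process, such a record would be revisited at every substantial modification, not filled in once.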

⚖️ Related Enforcement

No enforcement actions for this article yet. Follow developments via the Enforcement Tracker.


Frequently asked questions

What does the risk management system for AI entail?
Article 9 requires a continuous, iterative risk management system that identifies, analyses, evaluates and mitigates risks throughout the entire lifecycle of the AI system.
Is a risk assessment one-time or ongoing?
Risk management must be ongoing. It is not a one-time check but a continuous process that is regularly updated, especially when significant changes are made to the AI system.
Do SMEs also need to comply with Article 9 of the AI Act?
Article 9 of the AI Act does not provide a general exemption for SMEs. However, the AI Act includes supportive measures and potentially lighter obligations for small and medium-sized enterprises, depending on their role in the AI value chain.
How does Article 9 of the AI Act relate to the GDPR?
Article 9 of the AI Act complements the GDPR. While the GDPR protects personal data, the AI Act focuses on the safety and trustworthiness of AI systems. Organisations must comply with both regulations when their AI system processes personal data.
What are the deadlines for Article 9 of the AI Act?
The AI Act follows a phased implementation: prohibited AI practices apply from 2 February 2025, obligations for high-risk AI systems from 2 August 2026, and other provisions take effect gradually. Since Article 9 concerns high-risk AI systems, the relevant deadline is 2 August 2026.
Does Article 9 of the AI Act also apply to AI systems I purchase?
Yes, Article 9 of the AI Act may also be relevant when you purchase AI systems. As a deployer, you have your own obligations under the AI Act, regardless of whether you developed the system yourself or purchased it from a provider.
What is the difference between provider and deployer under Article 9 of the AI Act?
Under Article 9 of the AI Act, the provider is the entity that develops or places the AI system on the market, while the deployer is the entity that uses the system under its own authority. Both roles carry different obligations.
What documentation does Article 9 of the AI Act require?
Article 9 of the AI Act requires that relevant documentation is maintained as part of the compliance process. This may include technical documentation, instructions for use, logs or declarations of conformity, depending on the classification of the AI system.
How often must the risk management system be updated?
The risk management system must be a continuous, iterative process throughout the entire lifecycle of the AI system. It must be updated when substantial modifications are made, when new risk insights emerge, or when post-market monitoring yields new information.
What is the difference between a DPIA (GDPR) and the risk management system of Article 9?
A DPIA under the GDPR focuses on risks to privacy and data protection. The Article 9 risk management system is broader and covers risks to health, safety and all fundamental rights. Both assessments are complementary and may partially overlap, but do not replace each other.
Do I need to explicitly document residual risks?
Yes, Article 9 requires that residual risks remaining after risk mitigation measures are explicitly identified and documented. These residual risks must be assessed as acceptable and communicated to the deployer through the instructions for use.
Can I combine the AI Act risk management system with ISO 31000 or ISO 23894?
Yes, existing risk management frameworks like ISO 31000 (risk management) and ISO/IEC 23894 (AI risk management) can serve as a foundation. However, you must ensure all specific Article 9 requirements are covered, including the focus on fundamental rights and the obligation to test with representative data.
