What does the risk management system for AI entail?
Article 9 requires a continuous, iterative risk management system that identifies, analyses, evaluates and mitigates risks throughout the entire lifecycle of the AI system.
Is a risk assessment one-time or ongoing?
Risk management must be ongoing. It is not a one-time check but a continuous process that is regularly updated, especially when significant changes are made to the AI system.
Do SMEs also need to comply with Article 9 of the AI Act?
Article 9 of the AI Act does not provide a general exemption for SMEs. However, the AI Act includes supportive measures and potentially lighter obligations for small and medium-sized enterprises, depending on their role in the AI value chain.
How does Article 9 of the AI Act relate to the GDPR?
Article 9 of the AI Act complements the GDPR. While the GDPR protects personal data, the AI Act focuses on the safety and trustworthiness of AI systems. Organisations must comply with both regulations when their AI system processes personal data.
What are the deadlines for Article 9 of the AI Act?
The AI Act follows a phased implementation. Prohibited AI practices apply from 2 February 2025, and the obligations for high-risk AI systems apply from 2 August 2026, with a longer transition period for high-risk systems covered by Annex I product legislation. Since Article 9 applies to high-risk AI systems, its risk management requirements follow the high-risk timeline.
Does Article 9 of the AI Act also apply to AI systems I purchase?
Yes, Article 9 of the AI Act may also be relevant when you purchase AI systems. As a deployer, you have your own obligations under the AI Act, regardless of whether you developed the system yourself or purchased it from a provider.
What is the difference between provider and deployer under Article 9 of the AI Act?
Under the AI Act, the provider is the entity that develops an AI system or places it on the market, while the deployer is the entity that uses the system under its own authority. The risk management obligations of Article 9 rest primarily with the provider, but each role carries its own obligations under the Act.
What documentation does Article 9 of the AI Act require?
Article 9 of the AI Act requires that relevant documentation be maintained as part of the compliance process. This may include technical documentation, instructions for use, logs or declarations of conformity, depending on the classification of the AI system.
How often must the risk management system be updated?
The risk management system must be a continuous, iterative process throughout the entire lifecycle of the AI system. It must be updated when substantial modifications are made, when new risk insights emerge, or when post-market monitoring yields new information.
What is the difference between a DPIA (GDPR) and the risk management system of Article 9?
A DPIA under the GDPR focuses on risks to privacy and data protection. The Article 9 risk management system is broader and covers risks to health, safety and all fundamental rights. Both assessments are complementary and may partially overlap, but do not replace each other.
Do I need to explicitly document residual risks?
Yes, Article 9 requires that residual risks remaining after risk mitigation measures are explicitly identified and documented. These residual risks must be assessed as acceptable and communicated to the deployer through the instructions for use.
Can I combine the AI Act risk management system with ISO 31000 or ISO 23894?
Yes, existing risk management frameworks like ISO 31000 (risk management) and ISO/IEC 23894 (AI risk management) can serve as a foundation. However, you must ensure all specific Article 9 requirements are covered, including the focus on fundamental rights and the obligation to test with representative data.