Article 13 of 113
Article 13: Transparency and provision of information to deployers
Chapter III: High-Risk AI Systems
⏳ Applies from 2 August 2026
Official text
Source: EUR-Lex, Regulation (EU) 2024/1689 — text reproduced verbatim.
✅ Compliance Checklist
- ☐ Instructions for use prepared with intended purpose
- ☐ Performance level and limitations described
- ☐ Provider information included
- ☐ Risks to health, safety and fundamental rights described
- ☐ Human oversight measures described
Related Recitals
⚖️ Related Enforcement
No enforcement actions for this article yet. Follow developments via the Enforcement Tracker.
Cross-references
Annexes
Frequently asked questions
What information must accompany a high-risk AI system?
Article 13 requires high-risk AI systems to be accompanied by clear instructions for use with information about the provider, intended purpose, performance level, known limitations and risks.
Do SMEs also need to comply with Article 13 of the AI Act?
Article 13 of the AI Act does not provide a general exemption for SMEs. However, the AI Act includes supportive measures and potentially lighter obligations for small and medium-sized enterprises, depending on their role in the AI value chain.
How does Article 13 of the AI Act relate to the GDPR?
Article 13 of the AI Act complements the GDPR. While the GDPR protects personal data, the AI Act focuses on the safety and trustworthiness of AI systems. Organisations must comply with both regulations when their AI system processes personal data.
What are the deadlines for Article 13 of the AI Act?
The AI Act follows a phased implementation: prohibited AI practices apply from February 2025, obligations for high-risk AI systems from August 2026, and other provisions take effect gradually. Article 13 concerns high-risk AI systems and therefore applies from 2 August 2026.
Does Article 13 of the AI Act also apply to AI systems I purchase?
Yes, Article 13 of the AI Act may also be relevant when you purchase AI systems. As a deployer, you have your own obligations under the AI Act, regardless of whether you developed the system yourself or purchased it from a provider.
What is the difference between provider and deployer under Article 13 of the AI Act?
Under the AI Act, the provider is the entity that develops an AI system or places it on the market, while the deployer is the entity that uses the system under its own authority. Article 13 obliges providers to design high-risk systems transparently and to supply instructions for use; deployers rely on that information to meet their own obligations.
What documentation does Article 13 of the AI Act require?
Article 13 of the AI Act requires that relevant documentation is maintained as part of the compliance process. Depending on the classification of the AI system, this may include technical documentation, instructions for use, logs, or declarations of conformity.
How do I document compliance with Article 13 of the AI Act?
You document compliance with Article 13 of the AI Act by establishing a risk management system, maintaining technical documentation, and conducting internal audits. Keep all relevant documents for the period prescribed by the AI Act.
What does transparency concretely mean for a high-risk AI system?
Article 13 requires high-risk AI systems to be designed so that their operation is sufficiently transparent to deployers. Concretely, this means clear instructions for use, information about performance and limitations, an explanation of expected inputs, and information about the circumstances in which the system may produce errors.
Do I need to publish the accuracy and error rates of my AI system?
Yes. Article 13(3) requires that the instructions for use contain information about the AI system's performance, including its level of accuracy, robustness and cybersecurity, specified for the intended purpose and conditions of use.
How does Article 13's transparency obligation relate to trade secrets?
Article 13 requires transparency towards deployers and supervisory authorities, not the general public. Trade secrets and intellectual property are protected, but you must share sufficient information so deployers can use the system responsibly and authorities can assess compliance.