Article 26: Obligations of deployers of high-risk AI systems
Chapter III: High-Risk AI Systems
⏳ Applies from 2 August 2026
Official text
Source: EUR-Lex, Regulation (EU) 2024/1689 — text reproduced verbatim.
🎯 What does this mean for you?
🏭 Provider
Article 26 primarily targets deployers, but as a provider you must supply them with all information they need to fulfil their obligations (instructions for use, risks, limitations).
🏢 Deployer
This is YOUR core obligation. You must: (1) take appropriate technical and organisational measures, (2) ensure input data is relevant and sufficiently representative, (3) monitor the system's operation, (4) retain logs, (5) inform affected persons, and (6) conduct a FRIA where required.
🏪 SME / Startup
Focus on the basics: know which high-risk AI systems you use, train your staff, and maintain contact with your provider. Most obligations can be covered through good vendor management.
🏛️ Public Sector
Pay extra attention to the FRIA obligation (Art. 27) and the duty to inform citizens. Ensure your algorithm description in the Algorithm Register aligns with the Art. 26 requirements.
✅ Compliance Checklist
- ☐ Technical and organisational measures taken
- ☐ Input data verified for relevance
- ☐ AI system operation is monitored
- ☐ Logs retained in accordance with Art. 19
- ☐ Affected persons informed about AI use
- ☐ FRIA conducted (if required, Art. 27)
- ☐ Staff trained in AI literacy (Art. 4)
📖 Related Recitals
Recital 93 (Primary)
Whilst risks related to AI systems can result from the way such systems are designed, risks can as well stem from how such AI systems are used. Deployers of high-risk AI systems therefore play a critic…
Recital 94 (Primary)
Any processing of biometric data involved in the use of AI systems for biometric identification for the purpose of law enforcement needs to comply with Article 10 of Directive (EU) 2016/680, that allo…
Frequently asked questions
What are the obligations for deployers of high-risk AI?
Article 26 requires deployers to take appropriate technical and organisational measures, ensure relevant input data, monitor the system's operation, retain logs, inform affected persons, and report serious incidents. Certain deployers, notably public bodies and private entities providing public services, must also conduct a FRIA (Art. 27).
Do I need to conduct a FRIA as a deployer?
Public organisations and private organisations providing public services must conduct a Fundamental Rights Impact Assessment (FRIA) before deploying high-risk AI. This is regulated in Article 27.
What should I do in case of an incident with high-risk AI?
In case of serious incidents, the deployer must report the incident to the provider and the relevant market surveillance authority without undue delay.
Do SMEs also need to comply with Article 26 of the AI Act?
Article 26 of the AI Act does not provide a general exemption for SMEs. However, the AI Act includes supportive measures and potentially lighter obligations for small and medium-sized enterprises, depending on their role in the AI value chain.
How does Article 26 of the AI Act relate to the GDPR?
Article 26 of the AI Act complements the GDPR. While the GDPR protects personal data, the AI Act focuses on the safety and trustworthiness of AI systems. Organisations must comply with both regulations when their AI system processes personal data.
What are the deadlines for Article 26 of the AI Act?
The AI Act follows a phased implementation. Prohibited AI practices apply from 2 February 2025, obligations for most high-risk AI systems (Annex III) from 2 August 2026, and high-risk AI systems that are safety components of products regulated under Annex I from 2 August 2027. The deadline relevant to Article 26 therefore depends on which category your system falls into.
Does Article 26 of the AI Act also apply to AI systems I purchase?
Yes, Article 26 of the AI Act may also be relevant when you purchase AI systems. As a deployer, you have your own obligations under the AI Act, regardless of whether you developed the system yourself or purchased it from a provider.
What is the difference between provider and deployer under Article 26 of the AI Act?
Under Article 26 of the AI Act, the provider is the entity that develops or places the AI system on the market, while the deployer is the entity that uses the system under its own authority. Both roles carry different obligations.
What should I as deployer do if my AI vendor goes bankrupt?
As a deployer, you remain responsible for complying with your obligations even if the provider is no longer available. Ensure you have access to technical documentation, instructions for use and system logging. Consider contractual arrangements for source code escrow and transfer of responsibilities.
When do I as deployer become a provider myself under the AI Act?
You become a provider as a deployer when you: place the AI system on the market under your own name, substantially change the intended purpose of an already marketed system, or make a substantial modification to the system. In those cases, all provider obligations apply.
Do I as deployer need to retain the input and output data of the AI system?
Yes, Article 26(6) requires deployers of high-risk AI systems to retain automatically generated logs, insofar as these are under their control. The retention period is at least six months, unless other legislation (such as the GDPR) prescribes a different period.