Responsible AI Platform
Article 55 of 113

Article 55: Obligations of providers of general-purpose AI models with systemic risk

EU Official:
Chapter V: General-Purpose AI Models

Official text


Source: EUR-Lex, Regulation (EU) 2024/1689 — text reproduced verbatim.


⚖️ Related Enforcement

No enforcement actions for this article yet. Follow developments via the Enforcement Tracker.


Frequently asked questions

What extra obligations apply to GPAI models with systemic risk under Article 55?
Article 55 requires providers of GPAI models with systemic risk to perform model evaluations, assess and mitigate systemic risks, report serious incidents, and ensure an adequate level of cybersecurity.
Must providers of GPAI with systemic risk report incidents?
Yes. Article 55(1)(c) requires providers to keep track of, document and report relevant information about serious incidents and possible corrective measures, without undue delay, to the AI Office and, as appropriate, to national competent authorities.
Do SMEs also need to comply with Article 55 of the AI Act?
Article 55 of the AI Act does not provide a general exemption for SMEs. However, the AI Act includes supportive measures and potentially lighter obligations for small and medium-sized enterprises, depending on their role in the AI value chain.
How does Article 55 of the AI Act relate to the GDPR?
Article 55 of the AI Act complements the GDPR. While the GDPR protects personal data, the AI Act focuses on the safety and trustworthiness of AI systems. Organisations must comply with both regulations when their AI system processes personal data.
What are the deadlines for Article 55 of the AI Act?
The AI Act follows a phased implementation. Prohibited AI practices apply from 2 February 2025, the obligations for general-purpose AI models, including Article 55, apply from 2 August 2025, and the obligations for high-risk AI systems apply from 2 August 2026.
Does Article 55 of the AI Act also apply to AI systems I purchase?
Not directly. Article 55 imposes obligations on providers of general-purpose AI models with systemic risk, not on purchasers. However, if you deploy an AI system built on such a model, you have your own obligations as a deployer under other provisions of the AI Act, regardless of whether you developed the system yourself or purchased it.
What is the difference between provider and deployer under Article 55 of the AI Act?
Under the AI Act (the definitions are in Article 3), the provider is the entity that develops an AI system or model or places it on the market, while the deployer is the entity that uses an AI system under its own authority. Article 55 addresses providers of general-purpose AI models with systemic risk; deployer obligations are set out elsewhere in the Act.
What documentation does Article 55 of the AI Act require?
The core documentation obligations for general-purpose AI models sit in Article 53 (technical documentation and information for downstream providers). Article 55 adds to these: providers of models with systemic risk must document their model evaluations, including adversarial testing, and keep track of and document serious incidents and possible corrective measures.
What are the additional obligations for GPAI models with systemic risk?
On top of the base obligations under Article 53, providers of GPAI models with systemic risk must: perform model evaluations, including adversarial testing (red teaming); assess and mitigate possible systemic risks at Union level; track, document and report serious incidents to the AI Office; and ensure an adequate level of cybersecurity protection for the model and its physical infrastructure.
When is a GPAI model considered a systemic risk model?
A GPAI model is presumed to have systemic risk if cumulative training compute exceeds 10^25 FLOP (Article 51). The European Commission can also classify a model as systemic risk based on other criteria from Annex XIII (such as 10,000+ business users).
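The compute presumption above can be sketched in a few lines. This is a rough illustration only: the 6 × parameters × tokens estimate is a common community heuristic for training compute, not a method defined in the AI Act, and the example model sizes are hypothetical.

```python
# Sketch: checking the Article 51(2) presumption that a GPAI model has
# systemic risk when cumulative training compute exceeds 10^25 FLOP.
# The 6 * N * D compute estimate is a common heuristic (an assumption
# here), not part of the Regulation.

SYSTEMIC_RISK_THRESHOLD_FLOP = 1e25  # Article 51(2) threshold


def estimated_training_flop(n_params: float, n_tokens: float) -> float:
    """Rough training-compute estimate via the 6*N*D heuristic."""
    return 6 * n_params * n_tokens


def presumed_systemic_risk(n_params: float, n_tokens: float) -> bool:
    """True if the compute estimate meets the Article 51(2) presumption."""
    return estimated_training_flop(n_params, n_tokens) > SYSTEMIC_RISK_THRESHOLD_FLOP


# Hypothetical 500B-parameter model trained on 10T tokens:
# 6 * 5e11 * 1e13 = 3e25 FLOP, above the threshold.
print(presumed_systemic_risk(5e11, 1e13))   # True
# Hypothetical 7B-parameter model on 2T tokens: 8.4e22 FLOP, below it.
print(presumed_systemic_risk(7e9, 2e12))    # False
```

Note that this presumption is rebuttable in neither direction: the Commission can also designate a model below the threshold as systemic risk based on the Annex XIII criteria.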
Which models are currently classified as GPAI with systemic risk?
Providers must notify the Commission without delay, and in any event within two weeks, when their model meets the Article 51 threshold (Article 52). Large frontier models such as GPT-4, Gemini Ultra and Claude, trained with more than 10^25 FLOP, likely fall into this category.
