Responsible AI Platform

Article 60: Testing of high-risk AI systems in real world conditions outside AI regulatory sandboxes

Regulation (EU) 2024/1689
Chapter VI: Measures in support of innovation

Official text


Source: EUR-Lex, Regulation (EU) 2024/1689 — text reproduced verbatim.



Frequently asked questions

Can high-risk AI be tested outside sandboxes under Article 60?
Yes, Article 60 allows high-risk AI systems listed in Annex III to be tested in real-world conditions outside sandboxes, provided a test plan is drawn up and approved by the market surveillance authority.
What conditions apply to real-world testing?
The provider must submit a test plan for approval, obtain informed consent from test subjects, and register the test in the EU database. The market surveillance authority may suspend or terminate the test at any time.
Do SMEs also need to comply with Article 60 of the AI Act?
Article 60 of the AI Act does not provide a general exemption for SMEs. However, the AI Act includes supportive measures and potentially lighter obligations for small and medium-sized enterprises, depending on their role in the AI value chain.
How does Article 60 of the AI Act relate to the GDPR?
Article 60 of the AI Act complements the GDPR. While the GDPR protects personal data, the AI Act focuses on the safety and trustworthiness of AI systems. Organisations must comply with both regulations when their AI system processes personal data.
What are the deadlines for Article 60 of the AI Act?
The AI Act follows a phased implementation. Prohibited AI practices apply from February 2025, obligations for high-risk AI systems from August 2026, and other provisions take effect gradually. The specific deadline for Article 60 depends on the category of the obligation.
Does Article 60 of the AI Act also apply to AI systems I purchase?
Yes, Article 60 of the AI Act may also be relevant when you purchase AI systems. As a deployer, you have your own obligations under the AI Act, regardless of whether you developed the system yourself or purchased it from a provider.
What is the difference between provider and deployer under Article 60 of the AI Act?
Under Article 60 of the AI Act, the provider is the entity that develops or places the AI system on the market, while the deployer is the entity that uses the system under its own authority. Both roles carry different obligations.
What documentation does Article 60 of the AI Act require?
Article 60 of the AI Act requires that relevant documentation be maintained as part of the compliance process. This may include technical documentation, instructions for use, logs, or declarations of conformity, depending on the classification of the AI system.
