Provider vs Deployer
The most important distinction in the EU AI Act
Understand whether you are a provider or a deployer: this determines which obligations apply to you under the law.
Official Definitions
What does the law say exactly?
The EU AI Act makes a fundamental distinction between providers and deployers. A provider develops an AI system (or has one developed) and places it on the market or puts it into service under its own name or trademark. A deployer is an organization that uses an AI system under its own authority for professional purposes. This distinction determines which obligations apply to you.
Provider (Article 3, para 3)
Develops the AI system or markets it under own name/brand
Deployer (Article 3, para 4)
Uses an AI system under its own authority for professional purposes
Both roles possible
An organization can have different roles for different systems
Role determines obligations
Providers have more extensive obligations than deployers
Decision Tree: Determine Your Role
Three questions give you the answer
Use this decision tree to determine whether you are a provider or a deployer. Question 1: did you develop the AI system, or have it developed on your behalf? If no, you are probably a deployer; if yes, go to question 2. Question 2: do you place it on the market under your own name or brand? If yes, you are a provider; if no, go to question 3. Question 3: do you use it only internally? Even then you are a provider, because putting a self-developed system into service for your own use also counts. Note: if you buy an AI tool, significantly modify it, and resell it under your own name, you are both deployer and provider. A minimal code sketch of this logic follows the overview below.
Self-developed?
No = Deployer, Yes = Continue
Own name/brand?
Yes = Provider, No = Continue
Internal use only?
Yes = Provider, No = Provider
Significant modification?
You can change from deployer to provider!
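For readers who prefer to see the flow spelled out, here is a minimal sketch of the three questions above in Python. The function name, parameter names, and return values are illustrative assumptions, not terms defined in the AI Act, and the result is of course not legal advice.

```python
def classify_role(developed: bool, own_name_or_brand: bool,
                  internal_use_only: bool) -> str:
    """Minimal sketch of the three-question decision tree (not legal advice)."""
    # Question 1: did you develop the AI system, or have it developed?
    if not developed:
        return "probably a deployer"
    # Question 2: do you place it on the market under your own name or brand?
    if own_name_or_brand:
        return "provider"
    # Question 3: do you use it only internally?
    # Either way you remain a provider: putting a self-developed system into
    # service for your own use also counts.
    return "provider"

# Note from the section above: buying an AI tool, significantly modifying it,
# and reselling it under your own name makes you both deployer and provider.
print(classify_role(developed=True, own_name_or_brand=False, internal_use_only=True))
# -> "provider"
```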
Provider Obligations
What must a provider do for high-risk AI?
As a provider of a high-risk AI system, you have extensive obligations. You must implement a risk management system that remains active throughout the lifecycle. Technical documentation must be prepared before market introduction. You are responsible for conformity assessment, CE marking and registration in the EU database. After launch, you must perform post-market monitoring and report serious incidents.
Risk Management System
Continuous system throughout entire lifecycle
Technical Documentation
Extensive docs before market introduction
Conformity Assessment
CE marking and EU database registration
Post-market Monitoring
Active monitoring and incident reporting
Deployer Obligations
What must a deployer do?
As a deployer of a high-risk AI system, you have fewer obligations than a provider, but they are no less important. You must carefully follow the provider's instructions. For public organizations and certain high-risk applications, you must perform a Fundamental Rights Impact Assessment (FRIA). Meaningful human oversight is mandatory, as is monitoring input data. Serious incidents must be reported.
Follow Instructions
Follow provider instructions carefully
Perform FRIA
Fundamental Rights Impact Assessment
Human Oversight
Arrange meaningful human oversight
Report Incidents
Serious incidents within 15 days
Special Cases
When it gets complicated
There are situations where your role is not immediately clear. If you buy an AI system, significantly modify it, and market it under your own name, you are both a deployer (of the original system) and a provider (of the modified system). If you integrate general-purpose AI such as ChatGPT into your own system, you may become the provider of the combined system. Importers and distributors of AI systems from outside the EU have their own obligations under Articles 23 and 24.
Dual Role
You can be both provider and deployer
GPAI Integration
Integrating ChatGPT = possibly becoming provider
Import from outside EU
Special obligations for importers
Distributors
Article 24 obligations apply
Ready to get started?
Discover how we can help your organization with EU AI Act compliance.