Responsible AI Platform
Fundamental Guide 2025

Provider vs Deployer

The most important distinction in the EU AI Act

Understand whether you are a provider or a deployer. This determines all of your obligations under the law.

80%
are deployers
2x
more obligations for providers
€35M
maximum fine
~10 min read · Last updated: February 2026
Interactive Tool

Determine Your Role in 3 Steps

Answer a few questions about your AI system and instantly discover whether you are a provider, a deployer, or hold another role — including your specific obligations.

Takes less than 1 minute

📖

Official Definitions

What does the law say exactly?

The EU AI Act makes a fundamental distinction between providers and deployers. A provider develops an AI system (or has one developed) and places it on the market or puts it into service under its own name or trademark. A deployer is an organization that uses an AI system under its authority for professional purposes. This distinction determines which obligations apply to you.

🏭

Provider (Article 3, para 3)

Develops (or commissions) an AI system and markets it under own name/brand

👥

Deployer (Article 3, para 4)

Uses an AI system under its own authority for professional purposes

🔄

Both roles possible

An organization can have different roles for different systems

📋

Role determines obligations

Providers have more and heavier obligations than deployers


🎯

Decision Tree: Determine Your Role

Three questions give you the answer

Use this decision tree to determine whether you are a provider or a deployer. Question 1: did you develop the AI system yourself, or commission its development? If no, you are probably a deployer; if yes, go to question 2. Question 2: do you place it on the market under your own name or brand? If yes, you are a provider; if no, go to question 3. Question 3: do you put it into service for internal use only? If yes, you are still a provider: the Act also covers putting a system into service for your own use. If no, you likely developed the system for a client, and the party that markets it under its own name is the provider. Note: if you buy an AI tool, significantly modify it, and resell it under your own name, you are both deployer (of the original system) and provider (of the modified system).

1️⃣

Self-developed?

No = Deployer, Yes = Continue

2️⃣

Own name/brand?

Yes = Provider, No = Continue

3️⃣

Internal use only?

Yes = Provider, No = The party marketing it is

⚠️

Significant modification?

You can change from deployer to provider!
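The three-question decision tree above can be sketched as a small function. This is an illustrative simplification, not legal advice: the function name, parameters, and role labels are our own, and edge cases (dual roles, significant modification, GPAI integration) are deliberately left out.

```python
def determine_role(developed: bool, own_brand: bool, internal_only: bool) -> str:
    """Hypothetical sketch of the guide's 3-question decision tree."""
    # Q1: Did you develop the AI system yourself, or commission it?
    if not developed:
        return "deployer"  # you only use someone else's system
    # Q2: Do you place it on the market under your own name/brand?
    if own_brand:
        return "provider"
    # Q3: Do you put it into service for internal use only?
    if internal_only:
        return "provider"  # putting into service for own use also counts
    # Otherwise you likely built it for a client; the party marketing
    # it under its own name is the provider.
    return "likely not provider"

print(determine_role(developed=False, own_brand=False, internal_only=False))
```

Real-world cases are messier than three booleans, which is exactly why the guide flags significant modification as a trap: it can flip a `"deployer"` answer to `"provider"` for the modified system.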

🏭

Provider Obligations

What must a provider do for high-risk AI?

As a provider of a high-risk AI system, you have extensive obligations. You must implement a risk management system that remains active throughout the lifecycle. Technical documentation must be prepared before market introduction. You are responsible for conformity assessment, CE marking and registration in the EU database. After launch, you must perform post-market monitoring and report serious incidents.

⚙️

Risk Management System

Continuous system throughout entire lifecycle

📝

Technical Documentation

Extensive docs before market introduction

✅

Conformity Assessment

CE marking and EU database registration

📊

Post-market Monitoring

Active monitoring and incident reporting


👥

Deployer Obligations

What must a deployer do?

As a deployer of a high-risk AI system, you have fewer obligations than a provider, but they are no less important. You must carefully follow the provider's instructions. For public organizations and certain high-risk applications, you must perform a Fundamental Rights Impact Assessment (FRIA). Meaningful human oversight is mandatory, as is monitoring input data. Serious incidents must be reported.

📋

Follow Instructions

Follow provider instructions carefully

⚖️

Perform FRIA

Fundamental Rights Impact Assessment

👁️

Human Oversight

Arrange meaningful human oversight

🚨

Report Incidents

Report serious incidents to the provider immediately


⚠️

Special Cases

When it gets complicated

There are situations where your role is not immediately clear. If you buy an AI system, significantly modify it, and market it under your own name, you are both deployer (of the original system) and provider (of the modified system). If you integrate General Purpose AI like ChatGPT into your own system, you may become provider of the combined system. Importers of AI systems from outside the EU and distributors have their own obligations under Articles 23 and 24.

🔄

Dual Role

You can be both provider and deployer

🤖

GPAI Integration

Integrating ChatGPT = possibly becoming provider

🌍

Import from outside EU

Special obligations for importers

📦

Distributors

Article 24 obligations apply

🔄

Digital Omnibus Impact on Provider/Deployer Obligations

Proposed changes (November 2025)

The Digital Omnibus on AI proposal of November 19, 2025 contains multiple changes directly affecting provider and deployer obligations. For providers: the registration requirement for AI systems exempted from high-risk classification via Art. 6(3) is removed. Instead, a documented self-assessment suffices. The EDPB and EDPS warn in their Joint Opinion 1/2026 that this undermines accountability and creates an undesirable incentive to claim exemptions. For deployers: high-risk obligations are deferred to 6 or 12 months after harmonized standards become available, with an ultimate deadline of December 2027 or August 2028. The AI Office becomes exclusively responsible for supervision of AI systems based on GPAI models. Additionally, both providers and deployers would be allowed to process (sensitive) personal data for bias detection in all AI systems, not just high-risk. SME exemptions are extended to small mid-caps.

📋

Registration removed

Non-high-risk systems: self-assessment suffices.

⏳

Obligation deferral

High-risk requirements deferred until standards are available.

🔍

Bias detection expanded

Sensitive data for bias detection also in non-high-risk AI.

🏢

SMC extension

SME benefits now also for small mid-caps.

Frequently Asked Questions

Answers to the most common questions about the EU AI Act

Ready to get started?

Discover how we can help your organization with EU AI Act compliance.

500+
Professionals trained
50+
Organizations helped