Responsible AI Platform
AI Providers & GPAI

AI Act Compliance for Technology & Software

GPAI, SaaS and AI-as-a-Service — provider obligations under the AI Act

Practical guidelines for technology companies, software developers and AI providers to comply with the EU AI Act.

View the compliance checklist

Why Take Action Now?

The AI Act has major impact on technology and software companies

August 2025

First obligations for GPAI models and AI providers come into effect

Provider = Primary Responsibility

As a developer of AI systems, you bear the heaviest obligations under the AI Act

Fines up to €35 million

Or 7% of global annual turnover, whichever is higher — the EU AI Office will enforce the GPAI rules

Technical Documentation

Extensive documentation on training data, model architecture and evaluation results required

AI Applications under the AI Act

These AI applications fall under specific AI Act obligations

General Purpose AI (GPAI)

Foundation models such as LLMs, multimodal models and generative AI — specific GPAI obligations including transparency and copyright compliance.

Large Language Models · Multimodal AI · Text-to-image models · Code generation AI

SaaS with Embedded AI

Software-as-a-Service products with integrated AI functionality — provider responsibility for the AI component.

CRM with AI scoring · HR software with matching · Analytics platforms · Automated decision tools

AI-as-a-Service (AIaaS)

API-based AI services integrated by third parties — shared responsibility between provider and deployer.

AI API services · ML model hosting · Computer vision APIs · NLP-as-a-Service

Developer Tools & MLOps

Platforms for building, training and deploying AI models — responsibility in the AI supply chain.

ML platforms · AutoML tools · Model monitoring · Feature stores

Specific Challenges for Technology Companies

The AI Act brings unique compliance questions for the technology sector

Provider vs Deployer Role Division

As a tech company, you are often both provider and deployer. How do you split responsibilities in the value chain?

GPAI Compliance (Articles 51-56)

New rules specifically for general purpose AI. Technical documentation, copyright compliance and transparency obligations.

Open Source Exceptions

When does the open source exception (Art. 2(12)) apply? And what obligations remain for open source AI?

Systemic Risk Classification

GPAI models with "systemic risk" (>10²⁵ FLOPs) have additional obligations. How do you determine if your model qualifies?

Downstream Usage

Your AI is deployed by third parties — potentially in high-risk contexts. How do you limit liability?

AI Office Expectations

The EU AI Office oversees GPAI. What guidance and codes of practice are expected?

AI Act Compliance Roadmap

Practical steps for technology companies

1

AI Product Inventory

2-4 weeks

Map all AI products and services. What models, APIs and embedded AI do you offer?

2

Role & Risk Classification

1-2 weeks

Determine, for each product, your role (provider, deployer or distributor) and the risk level.

3

Gap Analysis

3-6 weeks

Compare current technical documentation, testing and monitoring with AI Act requirements.

4

Remediation

3-12 months

Implement model cards, technical documentation, bias testing, red teaming and monitoring.

5

Ongoing Compliance

Ongoing

Set up processes for model updates, incident reporting and collaboration with downstream deployers.

15-month trajectory

Implementation Roadmap

Detailed 6-phase timeline with concrete deliverables

1

Inventory

Month 1-2
Complete AI product register · Provider/deployer role per product · GPAI model inventory
2

Classification

Month 2-3
GPAI vs. high-risk vs. limited risk per system · Systemic risk assessment · Open source analysis
3

Gap Analysis

Month 3-5
Per AI product: gap between current docs and AI Act/GPAI requirements · Annex IV compliance check
4

Governance & Policy

Month 5-7
AI governance structure · Provider-deployer contracts · Incident response procedures
5

Technical Implementation

Month 7-12
Model cards & technical docs · Copyright compliance pipeline · Monitoring & logging systems · Red teaming framework
6

AI Office Ready

Month 12-15
Internal audit on GPAI obligations · Dry run for the AI Office · Continuous compliance monitoring

AI System Inventory Guide

Typical AI products in the technology sector and their likely classification

Important: The provider/deployer role determines your obligations. If you build AI for others, you are almost always a provider with full obligations.

Foundation Models (GPAI)

GPAI obligations
Large Language Models · Multimodal models · Text-to-image · Code generation

Art. 51-56 — always GPAI obligations, regardless of downstream use

GPAI with Systemic Risk

Systemic risk
Models >10²⁵ FLOPs · Broadly deployed models · High-impact models

Additional obligations: red teaming, incident reporting, cybersecurity evaluation

SaaS with AI Components

Context-dependent
CRM with AI scoring · Analytics platforms · Predictive tools · Recommendation engines

Classification depends on application area — high-risk if it falls under Annex III

AI API Services

Provider + downstream risk
Computer vision API · NLP-as-a-Service · Speech-to-text · Sentiment analysis

You are the provider of the AI component; the deployer's context of use co-determines the risk level

Open Source AI

Limited exception
Open source models · Open weights · Community models · Fine-tuning tools

Art. 2(12) exception does NOT apply to GPAI with systemic risk or high-risk applications

Internal Tools & MLOps

Usually minimal risk
ML platforms · Feature stores · Model monitoring · CI/CD for AI

Minimal risk unless it directly produces AI products for third parties
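For the systemic-risk category above, a first-pass check can use the common rule of thumb that dense-transformer training compute is roughly 6 FLOPs per parameter per training token. This is an engineering approximation, not the AI Office's counting method; treat the result as an estimate only.

```python
SYSTEMIC_RISK_THRESHOLD = 1e25  # cumulative training compute presumption (Art. 51)


def estimated_training_flops(num_parameters: float, training_tokens: float) -> float:
    # Rule of thumb for dense transformers: ~6 FLOPs per parameter per token.
    return 6.0 * num_parameters * training_tokens


def presumed_systemic_risk(num_parameters: float, training_tokens: float) -> bool:
    # True when the estimate meets or exceeds the 10^25 FLOPs presumption.
    return estimated_training_flops(num_parameters, training_tokens) >= SYSTEMIC_RISK_THRESHOLD


# Example: a 70B-parameter model trained on 15T tokens lands around 6.3e24 FLOPs,
# just below the 1e25 presumption threshold.
```

Models anywhere near the threshold warrant careful, cumulative compute accounting across all training runs rather than this rule of thumb.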

Classification Decision Tree

Quickly determine the classification of your AI product

Do you offer an AI model that can be used for diverse purposes (general purpose)?

Yes

GPAI obligations (Art. 51-56)

No

Go to next question

Is your AI system deployed in a high-risk context (Annex III)?

Yes

High-risk provider obligations

No

Go to next question

Does your system generate content (text, image, audio) or interact directly with users?

Yes

Limited risk — transparency obligations (Art. 50)

No

Go to next question

Is it purely an internal tool without direct impact on end users?

Yes

Minimal risk — only AI literacy required

No

Consult an expert for classification

This is a simplified decision tree. Fine-tuning GPAI may give you provider status. Seek legal advice.
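Teams that want to embed this first-pass triage in an intake form or internal tooling can sketch the simplified tree above in code. This is a rough sketch only, not legal advice; the category labels and function shape are our own.

```python
from enum import Enum


class Category(Enum):
    GPAI = "GPAI obligations (Art. 51-56)"
    HIGH_RISK = "High-risk provider obligations"
    LIMITED_RISK = "Limited risk - transparency obligations (Art. 50)"
    MINIMAL_RISK = "Minimal risk - AI literacy only"
    EXPERT = "Consult an expert for classification"


def classify(general_purpose: bool, annex_iii_context: bool,
             generates_content_or_interacts: bool, internal_only: bool) -> Category:
    # Mirrors the four questions of the simplified decision tree, in order.
    if general_purpose:
        return Category.GPAI
    if annex_iii_context:
        return Category.HIGH_RISK
    if generates_content_or_interacts:
        return Category.LIMITED_RISK
    if internal_only:
        return Category.MINIMAL_RISK
    return Category.EXPERT
```

Note that the order of the checks matters: a general-purpose model gets GPAI obligations regardless of the later questions, just as in the tree.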

Governance Structure

Recommended organizational structure for AI governance in technology companies

CTO / VP Engineering
AI Governance Board (Product, Legal, Engineering, Security)
AI Compliance Lead per product
Responsible AI Team
Security & Red Teaming
Legal & Regulatory Affairs

Integrate AI governance into your existing development lifecycle (SDLC) — make it part of your CI/CD pipeline, not a separate process.
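One way to act on that advice is a minimal CI gate that fails a release when required compliance artifacts are missing from the repository. The file paths below are illustrative assumptions; adapt them to your own layout.

```python
from pathlib import Path

# Hypothetical artifact paths - adapt to your repository layout.
REQUIRED_ARTIFACTS = [
    "docs/model_card.md",
    "docs/annex_iv_technical_documentation.md",
    "docs/copyright_policy.md",
    "reports/bias_testing.json",
]


def compliance_gate(repo_root: str = ".") -> list[str]:
    """Return the compliance artifacts missing from the repository."""
    root = Path(repo_root)
    return [p for p in REQUIRED_ARTIFACTS if not (root / p).is_file()]
```

A CI step can call this and fail the build when the returned list is non-empty, making the compliance check as routine as a unit test.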

Key Roles

AI Product Owner

Responsible for compliance, documentation and downstream communication for each AI product

AI Compliance Officer

Overall monitoring of AI Act and GPAI obligations across all products

ML Engineering Lead

Technical implementation of model cards, logging, bias testing and red teaming

AI Literacy Coordinator

Ensures AI literacy of employees — mandatory for all AI providers (Art. 4)

Compliance Checklist for AI Providers

Concrete checkpoints per AI product

Provider/deployer role established per product (Art. 3)
GPAI model registered with the AI Office (Art. 49)
Technical documentation per Annex IV (Art. 53)
Copyright compliance policy established (Art. 53(1)(c))
Transparency information for downstream providers (Art. 53(1)(b))
Risk management system established for high-risk systems (Art. 9)
Data governance & training data documentation (Art. 10)
Logging & monitoring in place (Art. 12)
AI literacy of employees ensured (Art. 4)
Red teaming conducted for systemic-risk GPAI (Art. 55)
Incident response procedure established (Art. 73)

This checklist applies per AI product. GPAI providers have additional obligations on top of standard provider requirements.

Common Mistakes to Avoid

Avoid these pitfalls in AI Act implementation

Thinking you are "just a deployer"

If you build AI for others, you are a provider. Fine-tuning GPAI can also make you a provider.

Open source = no obligations

The open source exception is limited. GPAI with systemic risk and high-risk applications are excluded.

Postponing technical documentation

Annex IV requires extensive docs on architecture, training data and evaluation. Start with model cards now.

Ignoring downstream usage

You are co-responsible if your AI is used in high-risk contexts. Document intended use clearly.

Forgetting AI literacy

Art. 4 mandates AI literacy for all employees working with AI. This obligation already applies since February 2025.

Waiting for codes of practice

GPAI obligations already apply. Start compliance now: codes of practice refine the obligations but do not replace them.
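For the documentation pitfall above, a minimal model-card skeleton gives teams a concrete starting point. The field names are illustrative and merely aligned with Annex IV themes; they are not a substitute for the full Annex IV documentation.

```python
from dataclasses import dataclass, field


@dataclass
class ModelCard:
    # Illustrative fields only; Annex IV defines the binding documentation scope.
    model_name: str
    version: str
    intended_use: str
    architecture_summary: str
    training_data_summary: str              # sources, provenance, curation steps
    evaluation_results: dict = field(default_factory=dict)
    known_limitations: list = field(default_factory=list)
    copyright_policy_ref: str = ""          # pointer to the Art. 53(1)(c) policy
```

Growing a record like this with every model release keeps Annex IV documentation from becoming a one-off scramble before an AI Office request.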

What Makes Technology AI Different?

Sector-specific considerations

Provider Obligations

Tech companies bear the heaviest compliance burden as AI providers under the AI Act

GPAI Regulation

Specific rules for foundation models that no other sector has — including transparency about training data

Supply Chain Responsibility

Your AI is deployed downstream — you are co-responsible for the entire chain

Open Source Nuances

Open source AI has limited exceptions, but not for GPAI with systemic risk

Tech regulation

Regulatory Overlap

How the AI Act connects with other tech regulation

Digital Markets Act (DMA)

Overlap: Gatekeepers, interoperability, algorithm transparency

Practical tip: DMA obligations for gatekeepers overlap with AI Act transparency requirements — combine reporting

Digital Services Act (DSA)

Overlap: Algorithmic recommender systems, risk assessment

Practical tip: DSA already requires transparency on recommender systems — AI Act adds technical documentation requirements

Cyber Resilience Act (CRA)

Overlap: Software security, vulnerability management, CE marking

Practical tip: CRA and AI Act both require CE marking — coordinate conformity assessments

GDPR

Overlap: Training data, automated decision-making (Art. 22), DPIA

Practical tip: FRIA can partially overlap with DPIA — combine where possible for efficiency

NIS2 Directive

Overlap: Cybersecurity, incident reporting, supply chain security

Practical tip: NIS2 incident reporting aligns with AI Act incident obligations for systemic risk GPAI

Ready to Start AI Act Compliance?

Practical tools and guidance for technology companies

Free 30-minute orientation call


Practical updates on GPAI, provider obligations and AI Office guidance