
AI Act Compliance for Technology & Software

GPAI, SaaS and AI-as-a-Service — provider obligations under the AI Act

Practical guidelines for technology companies, software developers and AI providers to comply with the EU AI Act.

View the compliance checklist

Why Take Action Now?

The AI Act has a major impact on technology and software companies

August 2025

The first obligations for GPAI models and their providers take effect on 2 August 2025

Provider = Primary Responsibility

As a developer of AI systems, you bear the heaviest obligations under the AI Act

Fines up to €35 million

Or 7% of global annual turnover, whichever is higher, for the most serious violations. The EU AI Office enforces the GPAI rules, with fines for GPAI providers of up to €15 million or 3% of turnover.

Technical Documentation

Extensive documentation of training data, model architecture and evaluation results is required

AI Applications under the AI Act

These AI applications fall under specific AI Act obligations

General Purpose AI (GPAI)

Foundation models such as LLMs, multimodal models and generative AI — specific GPAI obligations including transparency and copyright compliance.

Large Language Models · Multimodal AI · Text-to-image models · Code generation AI

SaaS with Embedded AI

Software-as-a-Service products with integrated AI functionality — provider responsibility for the AI component.

CRM with AI scoring · HR software with matching · Analytics platforms · Automated decision tools

AI-as-a-Service (AIaaS)

API-based AI services integrated by third parties — shared responsibility between provider and deployer.

AI API services · ML model hosting · Computer vision APIs · NLP-as-a-Service

Developer Tools & MLOps

Platforms for building, training and deploying AI models — responsibility in the AI supply chain.

ML platforms · AutoML tools · Model monitoring · Feature stores

Specific Challenges for Technology Companies

The AI Act brings unique compliance questions for the technology sector

Provider vs Deployer Role Division

As a tech company, you are often both provider and deployer. How do you split responsibilities in the value chain?

GPAI Compliance (Articles 51-56)

New rules specifically for general purpose AI. Technical documentation, copyright compliance and transparency obligations.

Open Source Exceptions

When does the open source exception apply? And what obligations remain for open source AI?

Systemic Risk Classification

GPAI models with "systemic risk" carry additional obligations. How do you determine whether your model qualifies? The training-compute presumption is sketched below.
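One concrete anchor is the compute presumption in Article 51(2): a GPAI model whose cumulative training compute exceeds 10^25 FLOPs is presumed to pose systemic risk. A minimal sketch of that single check, for illustration only; the Commission can also designate models on other grounds, so this test alone is not decisive:

```python
# Article 51(2): cumulative training compute above 10^25 FLOPs
# triggers the systemic-risk presumption for a GPAI model.
SYSTEMIC_RISK_FLOPS_THRESHOLD = 1e25

def presumed_systemic_risk(training_flops: float) -> bool:
    """True if the model meets the training-compute presumption."""
    return training_flops > SYSTEMIC_RISK_FLOPS_THRESHOLD

# Hypothetical frontier-scale training run:
print(presumed_systemic_risk(3.0e25))  # True
```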

Downstream Usage

Your AI is deployed by third parties — potentially in high-risk contexts. How do you limit liability?

AI Office Expectations

The EU AI Office oversees GPAI. What guidance and codes of practice are expected?

AI Act Compliance Roadmap

Practical steps for technology companies

1

AI Product Inventory

2-4 weeks

Map all AI products and services. What models, APIs and embedded AI do you offer?
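One way to keep the inventory structured from day one is a simple record per product. A minimal sketch, assuming illustrative field names (none of these are AI Act terms):

```python
from dataclasses import dataclass, field

@dataclass
class AIProductRecord:
    """One row in the AI product inventory (illustrative schema)."""
    name: str                          # e.g. "CRM lead scoring"
    category: str                      # "gpai" | "embedded" | "aiaas" | "tooling"
    models_used: list[str] = field(default_factory=list)
    exposed_via: list[str] = field(default_factory=list)   # "ui", "api", "sdk"
    third_party_models: list[str] = field(default_factory=list)

inventory = [
    AIProductRecord(
        name="CRM lead scoring",
        category="embedded",
        models_used=["in-house gradient boosting model"],
        exposed_via=["ui"],
    ),
]
```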

2

Role & Risk Classification

1-2 weeks

For each product, determine your role (provider, deployer or distributor) and the risk level.
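Recording the outcome per product keeps the classification auditable. A minimal sketch, assuming a simplified risk taxonomy; the Act's actual categories and edge cases need legal review:

```python
from enum import Enum

class Role(Enum):
    PROVIDER = "provider"
    DEPLOYER = "deployer"
    DISTRIBUTOR = "distributor"

class RiskLevel(Enum):
    PROHIBITED = "prohibited"
    HIGH = "high"
    LIMITED = "limited"    # transparency obligations apply
    MINIMAL = "minimal"

def heaviest_obligations(role: Role, risk: RiskLevel) -> bool:
    """Rough triage: providers of high-risk systems face the fullest set of duties."""
    return role is Role.PROVIDER and risk is RiskLevel.HIGH
```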

3

Gap Analysis

3-6 weeks

Compare current technical documentation, testing and monitoring with AI Act requirements.
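In its simplest form, the gap analysis is a diff between the artifacts you already have and those your classification requires. A minimal sketch with an illustrative, deliberately non-exhaustive requirements set:

```python
# Illustrative, non-exhaustive artifact list for a provider of a
# high-risk or GPAI system; derive the real list from the Act itself.
REQUIRED_ARTIFACTS = {
    "technical_documentation",
    "training_data_summary",
    "risk_management_file",
    "post_market_monitoring_plan",
}

def gap_analysis(existing: set[str]) -> set[str]:
    """Return the required artifacts that are still missing."""
    return REQUIRED_ARTIFACTS - existing

missing = gap_analysis({"technical_documentation"})
print(sorted(missing))
```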

4

Remediation

3-12 months

Implement model cards, technical documentation, bias testing, red teaming and monitoring.
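Of these artifacts, the model card is the most self-contained place to start. A minimal sketch of the fields such a card might capture; every value here is hypothetical:

```python
# Hypothetical model card for an imaginary product; adapt the fields
# to your own documentation standard.
model_card = {
    "model_name": "support-ticket-classifier-v3",
    "intended_use": "Routing customer support tickets",
    "out_of_scope_use": ["Employment decisions", "Credit scoring"],
    "training_data": "Anonymised tickets, 2021-2024 (summary only)",
    "evaluation": {"accuracy": 0.91, "subgroup_performance_checked": True},
    "known_limitations": ["Degrades on non-English tickets"],
    "owner": "ai-governance@example.com",
}
```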

5

Ongoing Compliance

Ongoing

Set up processes for model updates, incident reporting and collaboration with downstream deployers.
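Incident reporting, in particular, needs a defined intake format before anything goes wrong. A minimal sketch of an internal incident record, assuming an illustrative severity scale; reporting deadlines and the definition of a "serious incident" come from the Act itself:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AIIncident:
    """Internal record feeding the serious-incident reporting process."""
    product: str
    description: str
    severity: str                        # illustrative: "minor" | "serious"
    detected_at: datetime
    deployers_notified: bool = False     # track downstream notification

incident = AIIncident(
    product="resume-matching-api",
    description="Score shift for one applicant group after a model update",
    severity="serious",
    detected_at=datetime.now(timezone.utc),
)
```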

What Makes Technology AI Different?

Sector-specific considerations

Provider Obligations

Tech companies bear the heaviest compliance burden as AI providers under the AI Act

GPAI Regulation

Specific rules for foundation models that no other sector faces, including transparency about training data

Supply Chain Responsibility

Your AI is deployed downstream, and you share responsibility across the entire value chain

Open Source Nuances

Open source AI has limited exceptions, but not for GPAI with systemic risk

Need Help with AI Act Compliance?

We help technology companies with practical implementation

Book a free 30-minute orientation call, or sign up for updates on AI governance for technology and software.