Why Take Action Now?
The AI Act has a major impact on technology and software companies
August 2025
First obligations for GPAI models and AI providers come into effect
Provider = Primary Responsibility
As a developer of AI systems, you bear the heaviest obligations under the AI Act
Fines up to €35 million
Or 7% of global annual turnover, whichever is higher — the EU AI Office will enforce GPAI rules
Technical Documentation
Extensive documentation on training data, model architecture and evaluation results required
AI Applications under the AI Act
These AI applications fall under specific AI Act obligations
General Purpose AI (GPAI)
Foundation models such as LLMs, multimodal models and generative AI — specific GPAI obligations including transparency and copyright compliance.
SaaS with Embedded AI
Software-as-a-Service products with integrated AI functionality — provider responsibility for the AI component.
AI-as-a-Service (AIaaS)
API-based AI services integrated by third parties — shared responsibility between provider and deployer.
Developer Tools & MLOps
Platforms for building, training and deploying AI models — responsibility in the AI supply chain.
Specific Challenges for Technology Companies
The AI Act brings unique compliance questions for the technology sector
Provider vs Deployer Role Division
As a tech company, you are often both provider and deployer. How do you split responsibilities in the value chain?
GPAI Compliance (Articles 51-56)
New rules specifically for general purpose AI. Technical documentation, copyright compliance and transparency obligations.
Open Source Exceptions
When does the open source exception (Art. 2(12)) apply? And what obligations remain for open source AI?
Systemic Risk Classification
GPAI models with "systemic risk" (>10²⁵ FLOPs) have additional obligations. How do you determine if your model qualifies?
Downstream Usage
Your AI is deployed by third parties — potentially in high-risk contexts. How do you limit liability?
AI Office Expectations
The EU AI Office oversees GPAI. What guidance and codes of practice are expected?
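The systemic-risk threshold mentioned above (10²⁵ FLOPs of cumulative training compute, Art. 51(2)) can be roughly estimated with the common 6 × parameters × tokens heuristic for dense transformer training. A minimal sketch — the heuristic is an estimation convention from the ML literature, not a method prescribed by the Act, and the function names are illustrative:

```python
# Rough estimate of cumulative training compute using the common
# heuristic FLOPs ≈ 6 * parameters * training tokens. The 1e25 FLOPs
# threshold is the Art. 51(2) presumption of systemic risk; the
# heuristic itself is an assumption, not prescribed by the Act.

SYSTEMIC_RISK_THRESHOLD_FLOPS = 1e25  # Art. 51(2) presumption

def estimated_training_flops(n_parameters: float, n_tokens: float) -> float:
    """Approximate training compute for a dense transformer."""
    return 6.0 * n_parameters * n_tokens

def presumed_systemic_risk(n_parameters: float, n_tokens: float) -> bool:
    """True if the estimate meets the Art. 51(2) compute presumption."""
    return estimated_training_flops(n_parameters, n_tokens) >= SYSTEMIC_RISK_THRESHOLD_FLOPS

# Example: a 70B-parameter model trained on 15T tokens
# 6 * 7e10 * 1.5e13 = 6.3e24 FLOPs, below the 1e25 threshold
print(presumed_systemic_risk(7e10, 1.5e13))  # prints: False
```

Note that the presumption can also be triggered by a Commission designation, so an estimate below the threshold is not a guarantee of non-qualification.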
AI Act Compliance Roadmap
Practical steps for technology companies
AI Product Inventory
2-4 weeks: Map all AI products and services. Which models, APIs and embedded AI do you offer?
Role & Risk Classification
1-2 weeks: Determine, per product, your role (provider/deployer/distributor) and the risk level.
Gap Analysis
3-6 weeks: Compare current technical documentation, testing and monitoring against AI Act requirements.
Remediation
3-12 months: Implement model cards, technical documentation, bias testing, red teaming and monitoring.
Ongoing Compliance
Ongoing: Set up processes for model updates, incident reporting and collaboration with downstream deployers.
Implementation Roadmap
Detailed 6-phase timeline with concrete deliverables
Phase 1. Inventory (Month 1-2)
Phase 2. Classification (Month 2-3)
Phase 3. Gap Analysis (Month 3-5)
Phase 4. Governance & Policy (Month 5-7)
Phase 5. Technical Implementation (Month 7-12)
Phase 6. AI Office Ready (Month 12-15)
AI System Inventory Guide
Typical AI products in the technology sector and their likely classification
Important: The provider/deployer role determines your obligations. If you build AI for others, you are almost always a provider with full obligations.
Foundation Models (GPAI)
GPAI obligations: Art. 51-56 always apply, regardless of downstream use
GPAI with Systemic Risk
Systemic risk: additional obligations including red teaming, incident reporting and cybersecurity evaluation
SaaS with AI Components
Context-dependent: classification depends on the application area — high-risk if it falls under Annex III
AI API Services
Provider + downstream risk: you are the provider of the AI component; the deployer co-determines the risk level
Open Source AI
Limited exception: the Art. 2(12) exception does NOT apply to GPAI with systemic risk or high-risk applications
Internal Tools & MLOps
Usually minimal risk: unless the tool directly produces AI products for third parties
Classification Decision Tree
Quickly determine the classification of your AI product
Do you offer an AI model that can be used for diverse purposes (general purpose)?
GPAI obligations (Art. 51-56)
Go to next question
Is your AI system deployed in a high-risk context (Annex III)?
High-risk provider obligations
Go to next question
Does your system generate content (text, image, audio) or interact directly with users?
Limited risk — transparency obligations (Art. 50)
Go to next question
Is it purely an internal tool without direct impact on end users?
Minimal risk — only AI literacy required
Consult an expert for classification
This is a simplified decision tree. Fine-tuning GPAI may give you provider status. Seek legal advice.
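The simplified decision tree above can be sketched as code. The question order and outcome labels follow the tree; the class, function and field names are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class AIProduct:
    """Illustrative product profile; field names are assumptions."""
    general_purpose: bool         # usable for diverse purposes (GPAI)?
    annex_iii_context: bool       # deployed in a high-risk Annex III context?
    generates_or_interacts: bool  # generates content or interacts with users?
    internal_only: bool           # purely internal, no direct end-user impact?

def classify(product: AIProduct) -> str:
    """Walk the simplified decision tree top to bottom."""
    if product.general_purpose:
        return "GPAI obligations (Art. 51-56)"
    if product.annex_iii_context:
        return "High-risk provider obligations"
    if product.generates_or_interacts:
        return "Limited risk - transparency obligations (Art. 50)"
    if product.internal_only:
        return "Minimal risk - only AI literacy required"
    return "Consult an expert for classification"

print(classify(AIProduct(False, False, True, False)))
# prints: Limited risk - transparency obligations (Art. 50)
```

As the tree itself notes, this is simplified: fine-tuning a GPAI model can confer provider status, a nuance this sketch does not capture.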
Governance Structure
Recommended organizational structure for AI governance in technology companies
Integrate AI governance into your existing development lifecycle (SDLC) — make it part of your CI/CD pipeline, not a separate process.
Key Roles
AI Product Owner
Responsible per AI product for compliance, documentation and downstream communication
AI Compliance Officer
Overall monitoring of AI Act and GPAI obligations across all products
ML Engineering Lead
Technical implementation of model cards, logging, bias testing and red teaming
AI Literacy Coordinator
Ensures AI literacy of employees — mandatory for providers and deployers of AI systems (Art. 4)
Compliance Checklist for AI Providers
Concrete checkpoints per AI product
This checklist applies per AI product. GPAI providers have additional obligations on top of standard provider requirements.
Common Mistakes to Avoid
Avoid these pitfalls in AI Act implementation
Thinking you are "just a deployer"
If you build AI for others, you are a provider. Fine-tuning GPAI can also make you a provider.
Open source = no obligations
The open source exception is limited. GPAI with systemic risk and high-risk applications are excluded.
Postponing technical documentation
Annex IV requires extensive docs on architecture, training data and evaluation. Start with model cards now.
Ignoring downstream usage
You are co-responsible if your AI is used in high-risk contexts. Document intended use clearly.
Forgetting AI literacy
Art. 4 mandates AI literacy for all employees working with AI. This has applied since February 2025.
Waiting for codes of practice
GPAI obligations already apply. Start compliance now — codes of practice refine but do not replace.
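The "start with model cards now" advice above can begin with something as simple as a structured record per model. The fields below are a hypothetical starting set, loosely aligned with Annex IV documentation topics (architecture, training data, evaluation) — not an official AI Act template:

```python
import json

# Hypothetical minimal model card; field names are illustrative and
# loosely aligned with Annex IV topics, not an official template.
model_card = {
    "model_name": "example-classifier-v1",
    "provider": "Example B.V.",
    "intended_use": "Ticket routing for customer support",
    "out_of_scope_use": ["Credit scoring", "Employment decisions"],
    "architecture": "Fine-tuned transformer encoder, 110M parameters",
    "training_data": {
        "sources": ["internal support tickets (anonymised)"],
        "copyright_policy": "see copyright compliance policy v2",
    },
    "evaluation": {
        "accuracy": 0.91,
        "bias_tests": ["demographic parity across language groups"],
    },
    "last_updated": "2025-08-01",
}

print(json.dumps(model_card, indent=2))
```

Documenting intended and out-of-scope use per model also supports the downstream-usage point above: it is your clearest record of what deployments you did and did not foresee.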
What Makes Technology AI Different?
Sector-specific considerations
Provider Obligations
Tech companies bear the heaviest compliance burden as AI providers under the AI Act
GPAI Regulation
Specific rules for foundation models that no other sector has — including transparency about training data
Supply Chain Responsibility
Your AI is deployed downstream — you are co-responsible for the entire chain
Open Source Nuances
Open source AI has limited exceptions, but not for GPAI with systemic risk
Regulatory Overlap
How the AI Act connects with other tech regulation
Digital Markets Act (DMA)
Overlap: Gatekeepers, interoperability, algorithm transparency
Practical tip: DMA obligations for gatekeepers overlap with AI Act transparency requirements — combine reporting
Digital Services Act (DSA)
Overlap: Algorithmic recommender systems, risk assessment
Practical tip: DSA already requires transparency on recommender systems — AI Act adds technical documentation requirements
Cyber Resilience Act (CRA)
Overlap: Software security, vulnerability management, CE marking
Practical tip: CRA and AI Act both require CE marking — coordinate conformity assessments
GDPR
Overlap: Training data, automated decision-making (Art. 22), DPIA
Practical tip: FRIA can partially overlap with DPIA — combine where possible for efficiency
NIS2 Directive
Overlap: Cybersecurity, incident reporting, supply chain security
Practical tip: NIS2 incident reporting aligns with AI Act incident obligations for systemic risk GPAI
Related Articles
Deepen your knowledge of AI Act compliance for technology & software
FRIA: Complete Guide to Article 27 AI Act
Everything about the mandatory fundamental rights impact assessment for high-risk AI systems.
AI Agents: The Governance Challenge
How organizations responsibly deploy and manage AI agents under the AI Act.
Dutch DPA Warns of Security Risks with AI Agents
The Dutch Data Protection Authority warns about security risks with AI agents.
Ready to Start AI Act Compliance?
Practical tools and guidance for technology companies