Provider vs Deployer
The EU AI Act divides responsibility between the entity that creates an AI system and the entity that operates it. Understanding which role — or roles — applies to your organisation is the first step in any compliance programme. Getting this wrong means you either over-invest in obligations that are not yours, or under-invest and face regulatory exposure.
Provider definition (Art. 3(3)): Any natural or legal person, public authority, agency, or other body that develops an AI system or general-purpose AI model, or has an AI system or general-purpose AI model developed, and places it on the market or puts it into service under its own name or trademark, whether for payment or free of charge.
In plain English: you built it (or paid someone to build it for you) and you are putting it into the world under your name.
Deployer definition (Art. 3(4)): Any natural or legal person, public authority, agency, or other body using an AI system under its authority, except where the AI system is used in the course of a personal, non-professional activity.
In plain English: you are operating someone else's AI system in a professional context. You are the user, not the creator.
The key principle: roles are not mutually exclusive
An organisation can be both a provider and a deployer simultaneously — this is the most common scenario for companies that build AI tools for their own internal use. The Act places obligations on both roles independently. Meeting your deployer obligations does not satisfy your provider obligations, and vice versa. Both sets of requirements must be addressed in full.
Obligations at a glance
Provider obligations
Chapter III + Art. 22, 43–49, 72–73
Deployer obligations
Art. 26–27, Art. 50(3)
Art. 26 deployer obligations — in detail
Art. 26 sets out the deployer's statutory duties for high-risk AI systems. These obligations are non-delegable: you cannot contract them away to your provider. They apply from 2 August 2026 for Annex III systems.
Follow the instructions for use
You must operate the system strictly in accordance with the provider's Art. 13 instructions for use. Using the system for purposes beyond its intended use both breaches Art. 26(1) and can trigger substantial-modification analysis under Art. 25.
Ensure competent human oversight
You must assign human oversight to persons who have the necessary competence, training, and authority to understand the system's outputs and intervene when appropriate. Designating a human reviewer who cannot practically intervene does not satisfy this obligation.
Monitor and report
You must actively monitor the system's operation and report serious incidents to the provider and to national market surveillance authorities. You must retain logs where technically feasible. Passive deployment — set and forget — is non-compliant.
Inform affected workers
Art. 26(7) specifically requires employers to inform workers and their representatives before deploying AI systems that affect those workers. This interacts with national labour law and works council consultation rights, which may require information and consultation prior to deployment rather than concurrent with it.
Conduct a Fundamental Rights Impact Assessment (where required)
Art. 27 requires deployers that are bodies governed by public law, private entities providing public services, or deployers of the creditworthiness and life/health insurance systems listed in Annex III point 5(b)–(c) to conduct a FRIA before first use of the high-risk AI system. The FRIA must assess the affected populations, the fundamental rights at risk, and the mitigation measures in place.
Art. 17 provider quality management system
Art. 17 is one of the most operationally demanding provider obligations. It requires a documented QMS that governs the AI system's entire lifecycle — from design through post-market monitoring. The QMS must be proportionate to the size and nature of the provider, but must address all of the following areas:
Compliance strategy and policies
How the organisation ensures and maintains compliance with the Act
Design and development procedures
Techniques, processes, and systematic actions during development
Testing and validation procedures
How accuracy, robustness, and bias are assessed before deployment
Data management procedures
Data governance under Art. 10: acquisition, labelling, processing, retention
Risk management system
The documented Art. 9 risk management process integrated with the QMS
Post-market monitoring plan
How operational data is collected and analysed after deployment
Incident reporting procedures
Internal and external reporting chains for serious incidents under Art. 73
Change management procedures
How updates and modifications are assessed for compliance impact
Personnel accountability records
Who is responsible for each compliance activity and their qualifications
Documentation controls
Version control, retention, and availability of all compliance documentation
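For teams tracking QMS build-out, the ten areas above can be held as a simple machine-readable checklist and audited for gaps. The sketch below is illustrative only — the keys, descriptions, and the gap-check helper are our own naming, not terms from the Act:

```python
# Illustrative checklist of the Art. 17 QMS areas listed above.
# Keys and descriptions are paraphrases, not statutory text.
QMS_AREAS = {
    "compliance_strategy": "Strategy and policies for maintaining compliance",
    "design_development": "Techniques, processes, and systematic actions during development",
    "testing_validation": "Pre-deployment assessment of accuracy, robustness, and bias",
    "data_management": "Art. 10 data governance: acquisition, labelling, processing, retention",
    "risk_management": "Documented Art. 9 risk management process",
    "post_market_monitoring": "Collection and analysis of operational data after deployment",
    "incident_reporting": "Art. 73 serious-incident reporting chains",
    "change_management": "Compliance-impact assessment of updates and modifications",
    "personnel_accountability": "Who is responsible for each activity, and their qualifications",
    "documentation_controls": "Version control, retention, and availability of documentation",
}

def qms_gaps(documented: set[str]) -> list[str]:
    """Return the QMS areas not yet covered by documented procedures."""
    return [area for area in QMS_AREAS if area not in documented]

# Early-stage provider with only risk and data procedures in place:
print(qms_gaps({"risk_management", "data_management"}))
```

A dictionary keyed by area keeps the self-audit diffable in version control, which itself supports the documentation-controls requirement.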
Edge cases: when roles shift
Art. 25 sets out the circumstances under which a distributor, importer, or deployer takes on provider obligations. These edge cases are common and frequently misunderstood.
You are both provider and deployer
An organisation develops an AI system in-house and uses it internally for its own operations.
This is the most common edge case. Companies building and operating their own AI tools — whether for HR, credit assessment, customer triage, or any Annex III use — are simultaneously providers and deployers. You must satisfy all Chapter III provider obligations (technical documentation, risk management, QMS, CE marking where applicable) AND all Art. 26 deployer obligations. There is no reduced-obligation regime for in-house systems. Art. 2(1)(a) explicitly covers putting an AI system into service for your own use.
Fine-tuning or substantially modifying a third-party system
A company purchases a third-party AI model and fine-tunes it or modifies its intended purpose for deployment.
Under Art. 25(1)(b), a deployer who makes a substantial modification to a high-risk AI system becomes the provider of the modified system and inherits all provider obligations. A 'substantial modification' (Art. 3(23)) means a change affecting the system's compliance with the Act's requirements, or modifying the intended purpose beyond what the original provider assessed. Fine-tuning that shifts the system into a new Annex III category, or that meaningfully changes its risk profile, is likely a substantial modification. Routine parameter updates or domain adaptation within the original intended purpose generally are not.
Placing a third-party system on the market under your own brand
A company rebrands or white-labels an AI system from another provider and sells it under its own name.
Art. 25(1)(a) is unambiguous: any entity that places an AI system on the market under its own name or trademark becomes the provider for the purposes of the Act — regardless of who actually developed the system. If you white-label an AI system, you take on full provider obligations. You cannot contractually transfer those obligations back to the original developer, though contracts allocating compliance responsibilities are advisable. You must ensure you have the technical documentation and information needed from the original developer to satisfy your provider obligations.
Employers deploying AI to manage or monitor workers
An employer uses a third-party AI system for worker performance monitoring, task allocation, or recruitment screening.
The employer is unambiguously a deployer. Employment-related AI systems (performance monitoring, task allocation, promotion decisions, recruitment screening) are listed in Annex III, category 4. As a deployer, the employer must comply with Art. 26 in full — including the specific obligation at Art. 26(7) to inform affected workers and their representatives before deployment. National labour law and works council consultation rights may impose additional pre-deployment requirements. Workers cannot waive their rights under the Act by contract.
An importer or distributor becoming a provider
A company imports AI systems from outside the EU, or distributes them within the EU market.
Importers and distributors have their own obligations under Art. 23–24. They must verify that the provider has conducted a conformity assessment, prepared technical documentation, affixed CE marking, and registered the system in the EU database. Critically, Art. 25(1)(a)–(b) makes an importer or distributor the provider if they place the system on the market under their own name or trademark, or make a substantial modification. Even without becoming a full provider, importers must not place systems on the market if they know or have reason to believe the system is non-compliant.
Which am I? — Decision guide
Work through these four questions to determine your role. Note: this is a simplified guide. Complex arrangements — joint ventures, API integrations, multi-party deployments — may require legal analysis.
Did you develop the AI system, or have it developed on your behalf?
If YES
You are a provider. Continue to step 3.
If NO
Continue to step 2.
Are you placing the system on the market under your own name/trademark, or importing/distributing it?
If YES
You may become a provider under Art. 25. Seek legal advice. Also continue to step 4.
If NO
You are likely a deployer. Continue to step 4.
Are you also using the system in your own operations (not just selling/licensing it to others)?
If YES
You are both a provider AND a deployer. All Art. 9–17, Art. 43–49 obligations PLUS Art. 26 obligations apply.
If NO
You are a provider only. Art. 9–17, Art. 43–49, Art. 72–73 obligations apply.
Have you substantially modified the AI system beyond the provider's intended purpose?
If YES
You become a provider of the modified system under Art. 25(1)(b). All provider obligations apply to the modified system.
If NO
You are a deployer. Check whether the system is high-risk under Annex III to determine whether the Art. 26 obligations apply.
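The four-question guide above can be sketched as a small decision function. This is a hypothetical illustration — the type and field names are ours, not the Act's — and it deliberately ignores the complex arrangements the guide flags for legal analysis:

```python
# Hypothetical sketch of the four-question decision guide above.
# Field names are illustrative; joint ventures, API integrations,
# and multi-party deployments still need case-specific legal analysis.
from dataclasses import dataclass

@dataclass
class Organisation:
    developed_system: bool        # Q1: built it, or had it built on your behalf
    own_name_on_market: bool      # Q2: placing it on the market under your own name/trademark
    uses_internally: bool         # Q3: also operating it in your own operations
    substantially_modified: bool  # Q4: modified it beyond the provider's intended purpose

def determine_roles(org: Organisation) -> set[str]:
    roles: set[str] = set()
    if org.developed_system or org.own_name_on_market:
        # Development, or own-name placement (Art. 25(1)(a)), makes you a provider
        roles.add("provider")
    if org.substantially_modified:
        # Substantial modification: provider of the modified system (Art. 25)
        roles.add("provider")
    if org.uses_internally:
        # Using the system under your own authority makes you a deployer
        roles.add("deployer")
    return roles

# In-house HR tool, built and operated internally:
print(sorted(determine_roles(Organisation(True, False, True, False))))
# → ['deployer', 'provider']
```

Note that the roles accumulate rather than exclude one another, mirroring the Act's key principle that meeting one role's obligations does not discharge the other's.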
The provider–deployer relationship: what the Act requires
The Act creates a structured relationship between providers and deployers: deployers can only meet their Art. 26 obligations if providers supply the necessary information and cooperation. In practice, this means:
Information flow
Providers must supply adequate instructions for use (Art. 13), performance data, known limitations, and system capabilities. Without this information, deployers cannot comply with Art. 26 or conduct a FRIA.
Incident coordination
When a deployer reports a serious incident to the provider, the provider must investigate and take corrective action under Art. 72–73. Contracts should clearly establish the incident reporting channel and response SLAs.
Contractual allocation
Neither party can contractually eliminate their statutory obligations, but contracts can clarify who handles what — log access, monitoring responsibilities, update deployment, and regulatory authority cooperation.
This guide is provided for general informational purposes and reflects the EU AI Act (Regulation (EU) 2024/1689) as in force. It does not constitute legal advice. Role determinations in complex multi-party arrangements — particularly API-based integrations, joint ventures, and open-source deployments — require case-specific legal analysis. Consult a qualified legal professional for advice on your specific situation.