Provider vs Deployer: Understanding Your Role in the AI Value Chain
Clear analysis of how to determine your regulatory role and its compliance implications. Navigate edge cases, contractual arrangements, and shifting responsibilities.
One of the most consequential determinations you'll make under the EU AI Act isn't about your technology – it's about your role. Are you a provider or a deployer? The answer shapes your entire compliance journey, your documentation requirements, and your potential liability. Yet for many organizations, especially those in complex AI ecosystems, the answer isn't immediately clear.
Let's untangle this together, with real-world scenarios that will help you determine exactly where you stand.
Why This Classification Matters More Than You Think
The EU AI Act assigns fundamentally different obligations based on your role. Providers face the lion's share of compliance requirements – technical documentation, conformity assessments, CE marking, and post-market monitoring. Deployers have lighter obligations focused on use, monitoring, and human oversight.
Get this wrong, and you might spend months preparing for the wrong compliance requirements. Or worse, you might assume you're a deployer with minimal obligations, only to discover you're actually a provider facing the full weight of the Act.
The financial implications are significant. Provider compliance for a high-risk system typically costs €100,000-€500,000. Deployer compliance might be €10,000-€50,000. That's not a rounding error – it's a fundamental difference in resource allocation.
The Basic Definitions (In Plain English)
Provider: You develop an AI system or have it developed for you, and you place it on the market or put it into service under your name or trademark.
Deployer: You use an AI system under your authority for a professional purpose.
Sounds simple? It's not. The complexity emerges in real-world scenarios where roles blur, shift, or overlap.
The Clear Cases
Let's start with the straightforward scenarios:
You're Definitely a Provider If:
You build and sell AI software: You've developed a credit scoring AI system that banks can purchase and implement. You're the provider; the banks are deployers.
You offer AI as a service: Your company runs a cloud-based AI system for medical diagnosis. Healthcare providers upload images, you return results. You're the provider.
You develop custom AI for clients: A retailer hires you to build a customer recommendation system exclusively for them. You develop it, but it goes into service under your name/brand. You're the provider.
You significantly modify an existing AI system: You take an open-source model and substantially modify it for commercial use. Congratulations, you're now a provider.
You're Definitely a Deployer If:
You use off-the-shelf AI: Your HR department uses a third-party AI recruiting tool. You configure it for your needs but don't modify its core functionality. You're a deployer.
You integrate standard AI APIs: Your app uses Google's Vision API for image recognition. Google is the provider; you're the deployer.
You implement someone else's model: You license a fraud detection model from a vendor and run it on your infrastructure without modifications. You're a deployer.
The Complicated Middle Ground
Reality is messier than textbook definitions. Here's where organizations often struggle:
Scenario 1: The Internal Developer
Your company develops an AI system purely for internal use, never intending to market it.
The Determination: You're both provider and deployer. The Act explicitly covers systems "put into service" even if not marketed. You need to comply with provider obligations for development and deployer obligations for use.
Practical Impact: Full provider compliance required, but some documentation can be simplified since you control the entire lifecycle.
Scenario 2: The White Label Arrangement
Company A develops an AI system. Company B licenses it and offers it to customers under Company B's brand.
The Determination: Company B becomes a provider if they're placing it on the market under their name. Company A might also remain a provider if their brand is still attached.
Practical Impact: Potentially dual provider obligations. Clear contractual allocation of responsibilities is essential.
Scenario 3: The Heavy Customizer
You license a base AI model and extensively customize it for your specific industry needs.
The Determination: If modifications are "substantial" (changing the intended purpose or significantly affecting compliance), you become a provider.
Practical Impact: The threshold for "substantial modification" isn't precisely defined. Document your modifications and their impact carefully.
Scenario 4: The Platform Provider
You operate a platform where third parties can deploy their AI models to serve your customers.
The Determination: Complex. You might be:
- A deployer (using third-party AI)
- A provider (if the integrated system becomes "your" AI system)
- Both (different roles for different components)
Practical Impact: Evaluate each AI component separately. You might have different roles for different parts of your platform.
Scenario 5: The Fine-Tuner
You take a pre-trained language model and fine-tune it on your industry-specific data.
The Determination: Generally, you become a provider if:
- You substantially change its purpose
- You put it into service under your name
- The fine-tuning significantly affects safety or compliance characteristics
Practical Impact: Document the extent and impact of fine-tuning. Minor adjustments might not trigger provider status, but substantial retraining likely will.
The Edge Cases That Keep Lawyers Busy
The Subsidiary Question
A subsidiary develops AI for use across the parent company's operations.
Analysis: The subsidiary is likely a provider, even for internal group use. Corporate structure doesn't eliminate provider obligations. However, some member states might interpret this differently for purely internal use.
The Integration Puzzle
You combine multiple AI systems from different providers into a unified solution.
Analysis: You might become a provider of a "new" AI system if the integration creates substantially different functionality or risk profile. The more seamless and interdependent the integration, the more likely you're a provider.
The Research Exception
Your university research lab develops AI that companies later implement.
Analysis: Pure research is outside the Act's scope. But if you license or transfer the AI for commercial use, you might become a provider at that point. The transition from research to deployment is crucial.
The Open Source Conundrum
You significantly modify an open-source AI model for commercial deployment.
Analysis: You become a provider of the modified system. The original open-source developers have limited obligations, but your modifications make you fully responsible.
When Roles Change
Your role isn't always static. Common transitions include:
From Deployer to Provider
You start as a deployer of third-party AI but gradually modify and enhance it until it becomes substantially your own system.
Key Trigger Points:
- Changing the AI's intended purpose
- Modifying core algorithms or training
- Rebranding as your own solution
- Offering it to other organizations
From Provider to Deployer
Less common, but possible if you transfer IP rights and cease development/support responsibilities.
Dual Roles
Many organizations are both providers and deployers:
- Provider for AI you develop
- Deployer for AI you use from others
Each role requires separate compliance tracking.
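To make "separate compliance tracking" concrete, here's a minimal Python sketch of a per-system obligation register. The obligation lists are abbreviated illustrations drawn from this article, not an exhaustive legal checklist, and the system names are hypothetical:

```python
# Illustrative (non-exhaustive) obligations per role, as summarized in this article
PROVIDER_OBLIGATIONS = ["technical documentation", "conformity assessment", "post-market monitoring"]
DEPLOYER_OBLIGATIONS = ["human oversight", "input data controls", "usage monitoring"]

def compliance_register(systems: dict[str, set[str]]) -> dict[str, list[str]]:
    """Build a per-system obligation checklist from each system's role(s)."""
    register = {}
    for system, roles in systems.items():
        obligations = []
        if "provider" in roles:
            obligations += PROVIDER_OBLIGATIONS
        if "deployer" in roles:
            obligations += DEPLOYER_OBLIGATIONS
        register[system] = obligations
    return register

# Hypothetical portfolio: one system you built and use, one you merely license
portfolio = {
    "fraud-model-we-built": {"provider", "deployer"},  # dual role: full tracking
    "vendor-chatbot": {"deployer"},                    # deployer obligations only
}
print(compliance_register(portfolio))
```

The point of the structure is that obligations attach to the role for each system, not to the organization as a whole, which is why dual-role organizations need the tracking kept separate.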
The Contractual Dimension
Your contracts can't override the Act's definitions, but they're crucial for allocating responsibilities:
If You're a Provider:
- Clearly define your system's intended purpose
- Specify deployment conditions and limitations
- Establish information sharing obligations
- Define support and update responsibilities
- Clarify liability allocation
If You're a Deployer:
- Ensure providers meet their obligations
- Obtain necessary technical documentation
- Clarify modification rights and impacts
- Establish incident reporting procedures
- Define audit and inspection rights
In Uncertain Situations:
- Include role determination clauses
- Allocate compliance costs
- Establish indemnification provisions
- Define transition procedures if roles change
- Include regulatory change provisions
The Practical Determination Process
Here's a systematic approach to determine your role:
Step 1: Map Your AI Value Chain
- Identify all AI systems you interact with
- Document your relationship to each system
- Note any modifications or integrations
Step 2: Apply the Primary Test
For each AI system, ask:
- Did we develop this system?
- Is it placed on the market/put into service under our name?
- Have we substantially modified someone else's system?
Step 3: Consider Edge Cases
- Are we combining multiple systems?
- Are we changing intended purposes?
- Are we assuming responsibility through contracts?
Step 4: Document Your Determination
- Record your reasoning
- Identify grey areas
- Note where legal consultation might help
Step 5: Plan for Compliance
- List obligations for each role
- Allocate resources accordingly
- Establish governance structures
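The primary test in Step 2 reduces to a simple decision rule. Here's a minimal Python sketch of it; the field names and boolean criteria are illustrative simplifications of the Act's definitions, not legal tests, and any real determination in a grey area still needs counsel:

```python
from dataclasses import dataclass

@dataclass
class AISystemFacts:
    """Facts about one AI system in your value chain (field names are illustrative)."""
    name: str
    developed_in_house: bool      # Step 2: did we develop this system?
    under_our_name: bool          # placed on the market / put into service under our name?
    substantially_modified: bool  # did we substantially modify someone else's system?
    used_professionally: bool     # used under our authority for a professional purpose?

def determine_roles(facts: AISystemFacts) -> set[str]:
    """Apply the simplified primary test; returns the set of likely roles."""
    roles = set()
    # Provider test: development or substantial modification, under your own name
    if (facts.developed_in_house or facts.substantially_modified) and facts.under_our_name:
        roles.add("provider")
    # Deployer test: professional use under your authority
    if facts.used_professionally:
        roles.add("deployer")
    return roles

# Example: internally developed AI used by the same organization -> both roles apply
internal = AISystemFacts("hr-screening", True, True, False, True)
print(determine_roles(internal))
```

Running the rule across every system mapped in Step 1 gives you the starting point for Step 4's documentation; any system where the booleans themselves are debatable is exactly the grey area to flag for legal review.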
Common Misconceptions Clarified
"We're just deploying internally, so we're not a provider"
Reality: Internal deployment of self-developed AI still makes you a provider.
"We only modified 20% of the code"
Reality: Percentage of code changed doesn't determine substantial modification. Impact on functionality and compliance matters more.
"Our vendor is responsible for everything"
Reality: Deployers have independent obligations. You can't outsource all responsibility.
"We're using open source, so no one is the provider"
Reality: Someone becomes the provider when putting it into service. That might be you.
"Our contract says we're a deployer"
Reality: Contracts can't override the Act's definitions. Roles are determined by actual activities.
Sector-Specific Considerations
Healthcare
Hospitals using AI for diagnosis are typically deployers. But if you modify commercial AI for your specific protocols, you might become a provider. Integration with electronic health records could trigger provider status.
Financial Services
Banks using third-party credit scoring are deployers. But if you develop proprietary risk models, even based on vendor frameworks, you're likely a provider. Combining multiple AI systems for decision-making might create provider obligations.
Manufacturing
Using AI for quality control typically makes you a deployer. But if you develop custom vision systems for your production line, you're a provider. Modifying robots' AI for specific tasks could trigger provider status.
Technology Companies
Your role can shift rapidly. Today's deployer integration might become tomorrow's provider platform. Plan for role evolution from the start.
The Compliance Implications
If You're a Provider:
Immediate Actions:
- Start technical documentation (Annex IV)
- Establish quality management systems
- Plan for conformity assessment
- Implement post-market monitoring
Resource Requirements:
- Dedicated compliance team
- Technical documentation writers
- Quality assurance processes
- Legal and regulatory support
If You're a Deployer:
Immediate Actions:
- Ensure human oversight capabilities
- Implement monitoring procedures
- Establish input data controls
- Create usage documentation
Resource Requirements:
- Operational procedures
- Training programs
- Monitoring systems
- Incident response processes
If You're Both:
Immediate Actions:
- Separate compliance tracking
- Avoid conflating obligations
- Establish clear internal responsibilities
- Implement comprehensive governance
Looking Ahead: Preparing for Uncertainty
The AI landscape evolves rapidly. Today's clear deployer might be tomorrow's edge-case provider. Build flexibility into your compliance approach:
Document Everything
Even if you're "just" a deployer now, document your AI usage thoroughly. If your role changes, you'll have a foundation.
Build Scalable Processes
Create compliance processes that can scale up if you transition from deployer to provider.
Monitor Role Indicators
Watch for activities that might shift your role:
- Increasing modifications
- Customer offering considerations
- Integration complexity growth
- Responsibility assumptions
Maintain Flexibility
Design your compliance program to adapt as your role evolves. Rigid structures break when roles shift.
The Strategic Perspective
Your role determination isn't just about compliance – it's about strategic positioning:
Providers Can:
- Command premium prices for compliant AI
- Shape market standards
- Build competitive moats through compliance
- Control the AI value chain
Deployers Can:
- Focus on application over development
- Leverage others' compliance investments
- Maintain operational flexibility
- Reduce regulatory burden
Choose your role strategically when possible. Sometimes becoming a provider (with its obligations) offers better long-term value than remaining a deployer.
Getting Help When You Need It
Role determination can be complex. Consider legal consultation when:
- Multiple AI systems interact
- Modifications are substantial but unclear
- Contractual arrangements are complex
- Roles might shift over time
- Strategic decisions depend on role clarity
Your Next Steps
- This Week: List all AI systems you develop, use, or modify
- Next Week: Apply role determination criteria to each system
- This Month: Document determinations and identify grey areas
- Next Month: Implement role-appropriate compliance measures
Remember: your role determines your obligations, but it doesn't determine your commitment to responsible AI. Whether provider or deployer, the goal is trustworthy AI that serves users safely and effectively.
The EU AI Act's role definitions might seem like regulatory complexity, but they're really about clarity – knowing exactly what's expected of you. Determine your role accurately, and your compliance strategy can align with what your organization actually does with AI, leaving you well-positioned for the August 2026 deadline.