
Are you a 'Provider' or 'Deployer' of an AI System under the EU AI Act?

Posted on 11 June 2024

The EU AI Act ("the Act") is reshaping the landscape of artificial intelligence regulation within the European Union ("EU"), bringing with it a host of obligations and responsibilities for those involved in AI systems. Its reach extends beyond the confines of the EU: it applies to any company that makes an AI system available on the EU market or delivers services using AI systems to organisations in the EU. These organisations will need to understand how the Act will be interpreted.

 A critical aspect of the Act is the distinction between 'Providers' and 'Deployers' of AI systems. This differentiation is not just academic; it carries significant legal implications, and in practice, the line between the two may be blurred.

We are noticing that the traditional model of off-the-shelf AI systems is becoming less common. Businesses are increasingly seeking tailored solutions that either involve AI systems trained on their proprietary material or the development of AI systems based on algorithms and materials they provide. These custom approaches ensure better control over content and outputs, mitigating risks such as copyright or privacy breaches and the unauthorised disclosure of trade secrets or confidential information.

However, these scenarios raise questions about the classification of companies under the Act. When a company has an AI system trained on its material or engages a system integrator to develop a system based on its own algorithm, does it remain a mere 'Deployer', or does it cross the threshold for classification as a 'Provider'?

To assist in understanding whether an organisation is a Provider or a Deployer, it is important to note the distinctions between the two roles, the primary characteristics of which we have summarised below:

Definition: A Provider is an entity that develops an AI system, or has one developed, and places it on the market or puts it into service under its own name or trademark. A Deployer is an entity that uses an AI system under its authority, except for non-professional personal use.

Development: A Provider is directly involved in, or commissions, the creation and design of AI systems. A Deployer integrates and manages AI systems created by others.

Market placement: A Provider is responsible for introducing AI systems to the market. A Deployer uses AI systems within its operations without introducing them to the market.

Compliance obligations: A Provider must ensure AI systems meet safety, transparency and accountability standards before market introduction. A Deployer must ensure AI systems are used in compliance with the Act during operations, and must monitor their performance and outcomes.

Risk exposure: A Provider bears significant responsibilities and risks, including compliance with the full scope of the Act's requirements. A Deployer bears responsibility for verifying the Provider's compliance and the AI system's performance.

 

The distinction between the definitions of Provider and Deployer, as noted above, is crucial because the bulk of the obligations and responsibilities under the Act are shouldered by Providers.

A Provider is responsible for the market placement or service provision of a high-risk AI system, irrespective of whether it designed or developed the system. As with the General Data Protection Regulation ("GDPR") and its definitions of processor and controller, the classification of an entity as a Provider or Deployer is not a matter of contractual agreement but depends on the factual circumstances.

Extensive customisation of an AI system by a company could lead to its reclassification as a Provider. Likewise, if a company markets a high-risk AI system under its own brand (for example, by attaching its name or trademark to the system), it may be deemed a Provider.

Similarly, integrating or customising an AI system could tip a company into the Provider category if: a) the original AI system is high-risk, and the company's customisations result in an AI system that differs from the original (i.e. a "substantial modification") while remaining high-risk; or b) the company modifies an AI system not originally classified as high-risk in such a way that it becomes high-risk.

Ultimately, determining whether a company is a Provider, a Deployer, or neither will require an assessment of the facts. Given that the Act primarily focuses on Providers' obligations, correct classification is vital for managing risk exposure in the event of regulatory challenge: businesses that self-assess and get it wrong risk being reclassified and becoming subject to more onerous obligations and liabilities than they anticipated.

Reach out to a member of our team for assistance with categorisation, or visit our AI Resource Centre for more information and helpful tools (like our 'For the Attention of the Board' AI report).

 
