
AI Act deployer vs provider: the obligations of each role under Articles 16 and 26, and the three Article 25 cases that flip you from one to the other.

Regulation (EU) 2024/1689 distinguishes between providers (Art. 3(3)) — those who develop an AI system and place it on the market — and deployers (Art. 3(4)) — those who use the system under their authority. Each role has its own obligations: Art. 16 lists 12 provider obligations for high-risk AI; Art. 26 lists 11 deployer obligations. Article 25 sets out three cases where a deployer, distributor, importer or third party is considered a provider — and inherits the full Article 16 load. Both roles face the same penalty tier under Art. 99(4): up to €15M or 3% of worldwide turnover.

Generate AI Act dossier — €249 · Free: check your AI system risk

€249 one-time payment · 12 PDF documents in ZIP · 45 minutes · 100% in your browser

Regulation (EU) 2024/1689 · Article 11 + Annex IV · 12 documents · 100% browser-side — your data never leaves your machine

The numbers

12 vs 11
Art. 16 lists 12 provider obligations (a–l). Art. 26 lists 11 deployer obligations across paragraphs 1–11.
Art. 25
Three cases where deployer/distributor/importer becomes provider: rebranding, substantial modification, or repurposing into high-risk.
€15M / 3%
Same Art. 99(4) penalty tier for both roles. Plus Art. 50 transparency (Art. 50(1)–(2) provider, Art. 50(3)–(4) deployer).

The two roles side by side — and the three Article 25 flips

The provider builds and places on the market. The deployer uses under its authority for a professional activity. The distinction is sharp on paper but blurs quickly in practice — especially when you fine-tune, repackage or rebrand.

▲ PROVIDER (ART. 16)

Provider obligations (a–l) for high-risk AI

(a) Section 2 requirements compliance · (b) name/contact identification · (c) QMS under Art. 17 · (d) keep Art. 18 documentation 10 years · (e) keep logs · (f) Art. 43 conformity assessment · (g) Art. 47 EU Declaration of Conformity · (h) Art. 48 CE marking · (i) Art. 49 registration · (j) Art. 20 corrective actions · (k) demonstrate conformity on request · (l) accessibility (Directives (EU) 2016/2102 and 2019/882).

▼ DEPLOYER (ART. 26)

Deployer obligations for high-risk AI

26(1) use as per instructions · 26(2) assign human oversight to competent persons · 26(4) control input data relevance · 26(5) monitor operation + report serious incidents · 26(6) keep logs at least 6 months · 26(7) inform workers' representatives before workplace use · 26(8) public-authority registration · 26(9) reuse Art. 13 info for Art. 35 GDPR DPIA · 26(10) ex-ante authorisation for post-remote biometric ID by law enforcement · 26(11) inform natural persons subject to Annex III decisions.

Article 25 — three cases where a deployer becomes a provider and inherits the full Article 16 load:

a
Art. 25(1)(a) — Rebranding
You put your name or trademark on a high-risk AI system already placed on the market or put into service. The original provider no longer holds Art. 16 obligations for that specific system — you do. Contractual arrangements may reallocate obligations but only as between the parties.
b
Art. 25(1)(b) — Substantial modification
You make a substantial modification to a high-risk AI system already placed on the market in such a way that it remains high-risk under Art. 6. "Substantial modification" is defined in Art. 3(23) as a change not foreseen in the initial conformity assessment that affects compliance with Section 2 requirements or modifies the intended purpose.
c
Art. 25(1)(c) — Modification of intended purpose
You modify the intended purpose of an AI system — including a general-purpose AI system — that has not been classified as high-risk, in such a way that the system becomes high-risk under Art. 6. The intended-purpose change is what makes you a provider, even without any code change.

Art. 25(2): when any of the three cases occur, the original provider no longer holds Art. 16 obligations for that specific system — but must cooperate with the new provider and make available the necessary information and reasonable technical access to enable conformity assessment, unless the original provider had clearly specified that its system was not to be transformed into a high-risk AI system. Art. 25(3): for Art. 6(1) Annex I systems, the product manufacturer is considered the provider when the AI is placed on the market under its brand.
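For readers who prefer rules to prose, the three flips reduce to one boolean decision. A minimal Python sketch, our own simplification rather than anything in the Regulation (function and parameter names are hypothetical, and this is not legal advice):

```python
def becomes_provider(rebrands: bool,
                     substantially_modified: bool,
                     still_high_risk: bool,
                     purpose_change_makes_high_risk: bool) -> bool:
    """Illustrative reading of Art. 25(1)(a)-(c)."""
    if rebrands:
        # Art. 25(1)(a): your name/trademark on a high-risk system
        return True
    if substantially_modified and still_high_risk:
        # Art. 25(1)(b): substantial modification, system stays high-risk
        return True
    if purpose_change_makes_high_risk:
        # Art. 25(1)(c): intended-purpose change makes it high-risk
        return True
    return False
```

Note that the 25(1)(c) branch needs no code change at all: the intended-purpose flag alone flips the role.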

Three common mistakes

COMMON MISTAKE

"Integrating an external model makes us a provider"

No — integrating someone else's AI system or GPAI model into your product makes you the provider of the AI system you build, not of the integrated model. Art. 25(1)(c) only flips you to provider of the upstream component if you modify the intended purpose of that component such that it becomes high-risk. Routine integration with a stated, unchanged intended purpose keeps you as deployer of that component.

COMMON MISTAKE

"Deployer obligations are lighter — we can skip them as a SaaS"

Art. 26 imposes 11 paragraphs of substantive obligations including human oversight (26(2)), log retention of at least 6 months (26(6)), incident reporting under Art. 73 (via 26(5)), worker information (26(7)), and Annex III decision-subject notification (26(11)). Breaches sit in the same Art. 99(4) penalty tier as provider obligations: €15M / 3%.

COMMON MISTAKE

"Substantial modification means rewriting the model"

Art. 3(23) defines substantial modification as a change not foreseen in the initial conformity assessment AND either affecting compliance with Section 2 OR modifying intended purpose. Fine-tuning a model on your own customer data, retraining on a new domain, or changing the deployment context to a new Annex III use case can all be substantial modification — without rewriting any code.
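The Art. 3(23) test above is a small AND/OR rule, and writing it out makes the "no code rewrite needed" point concrete. An illustrative sketch (the names are ours, not the Regulation's):

```python
def is_substantial_modification(foreseen_in_assessment: bool,
                                affects_section_2: bool,
                                modifies_purpose: bool) -> bool:
    # Art. 3(23): a change NOT foreseen in the initial conformity
    # assessment AND (affecting Section 2 compliance OR modifying
    # the intended purpose).
    if foreseen_in_assessment:
        return False
    return affects_section_2 or modifies_purpose
```

Fine-tuning on a new domain can score as modifies_purpose=True even when the model code is untouched.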

Does the AI Act apply to your system?

Answer these four questions to determine your obligations.

Is your system machine-based and does it infer from its inputs how to generate outputs such as predictions, content, recommendations or decisions?
Art. 3(1) — definition of "AI system"
Is the system placed on the EU market or does its output affect persons in the EU?
Art. 2(1) — territorial scope (extraterritorial via 2(1)(c))
Is your system used in any Annex III domain? (employment, credit, education, law enforcement, migration, justice, critical infrastructure, biometrics)
Art. 6(2) + Annex III — high-risk classification
Are you the provider (developer) or the deployer (user) of the system?
Art. 3(3) provider · Art. 3(4) deployer — different obligations
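The four questions chain into a simple screen. A hedged Python sketch of that flow (the labels and return strings are ours, for illustration only; the full test linked below goes deeper):

```python
def aia_screen(is_ai_system: bool, in_eu_scope: bool,
               annex_iii_domain: bool, role: str) -> str:
    """Toy version of the four-question screen above."""
    if not is_ai_system:
        return "outside the Art. 3(1) definition"
    if not in_eu_scope:
        return "outside Art. 2(1) territorial scope"
    if annex_iii_domain:
        risk = "high-risk (Art. 6(2) + Annex III)"
    else:
        risk = "not high-risk via Annex III; check Art. 5 and Art. 50"
    duties = "Art. 16" if role == "provider" else "Art. 26"
    return risk + "; obligations under " + duties
```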

Take the full AI Act risk classification test →

What the ZIP contains

12 PDF documents generated from your inputs. Each cites the article of Regulation (EU) 2024/1689 it fulfils.

1

Risk Classification Report

Identifies whether your system is prohibited (Art. 5), high-risk (Art. 6 + Annex III) or subject to transparency obligations (Art. 50).

2

Technical Documentation

The 9 blocks of Annex IV in full: system description, training data, validation, performance metrics, risk management, human oversight. Art. 11 + Annex IV.

3

EU Declaration of Conformity

Signable document conforming to Art. 47 and Annex V.

4

Compliance Calendar

Key application dates: 2 Feb 2025, 2 Aug 2025, 2 Aug 2026, 2 Aug 2027. Art. 113.

5

Conformity Sheet

Executive summary of compliance status for authorities or commercial buyers. Art. 43 procedure.

6

Quality Management System (QMS)

QMS structure covering the 13 aspects required by Art. 17.

7

Deployer Instructions

Document for the entity deploying your system, conforming to Art. 13.

8

Evidence Checklist

Verifiable evidence list, cross-referenced to every Annex IV block.

9

Incident Report Template

Notification protocol conforming to Art. 73 (15 days general / 10 days death / 2 days widespread).

10

AI Literacy Programme

Training plan conforming to Art. 4, in force since 2 February 2025.

11

Post-Market Monitoring Plan

Plan structure required by Art. 72 and integrated into the technical documentation under Annex IV(9).

12

Fundamental Rights Impact Assessment (FRIA)

Template under Art. 27 for public bodies, private entities providing public services, and deployers of Annex III points 5(b) and 5(c) systems.

See before you buy — Download a sample dossier (PDF, fictional company) — Real structure, real articles, real format. Fictional data.

Generated from your inputs, in your browser. No data leaves your machine.

What you pay

🧾 AI ACT COMPLIANCE CONSULTANCY
€5,000–€15,000
3–6 months. They explain the obligations to you.
✓ AICHECK
€249
12 documents. 45 minutes. Solves the documentation.

Technical documentation and conformity assessment: two layers

● LAYER 1

Technical documentation — Annex IV

12 documents. 45 minutes. €249. The documentation your system needs before being placed on the market.

∅ LAYER 2

Conformity assessment by notified body

If your system falls under Art. 43(1) (Annex III point 1 biometrics with notified-body route, or Annex I products), you will need third-party conformity assessment. That is a separate process. AICheck does not replace it.

We do not sell audits. We do not sell consultancy. We sell the tool that structures your documentation under Annex IV.

Penalty regime

Article 99 of Regulation (EU) 2024/1689. Chapter XII (Penalties) applies from 2 August 2025.

🇪🇺
Non-compliance with prohibited practices (Art. 5)
€35M / 7%

Art. 99(3). Up to €35 million or 7% of total worldwide annual turnover, whichever is higher. For SMEs and start-ups: whichever is lower (Art. 99(6)).

🇪🇺
Non-compliance with operator obligations (high-risk, transparency, deployer)
€15M / 3%

Art. 99(4). Includes failure to draw up technical documentation under Art. 11 + Annex IV. Covers obligations of providers (Art. 16), deployers (Art. 26), authorised representatives (Art. 22), importers (Art. 23), distributors (Art. 24), notified bodies (Art. 31, 33, 34) and transparency under Art. 50.

🇪🇺
Supply of incorrect, incomplete or misleading information
€7.5M / 1%

Art. 99(5). Applies when information provided to notified bodies or national competent authorities is wrong or misleading.
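The "fixed amount or percentage, whichever is higher" mechanics, and the SME inversion of Art. 99(6), are easy to sketch. Illustrative arithmetic only, with the percentage given as a whole percent:

```python
def fine_ceiling(fixed_eur: int, pct: int, turnover_eur: int,
                 sme: bool = False) -> int:
    """Art. 99 ceiling: higher of the two amounts; lower for SMEs (Art. 99(6))."""
    pct_amount = turnover_eur * pct // 100  # pct is a whole percent, e.g. 3
    return min(fixed_eur, pct_amount) if sme else max(fixed_eur, pct_amount)

# Art. 99(4) tier, firm with EUR 2bn worldwide turnover:
fine_ceiling(15_000_000, 3, 2_000_000_000)   # 60_000_000
# Same tier, SME with EUR 100m turnover:
fine_ceiling(15_000_000, 3, 100_000_000, sme=True)   # 3_000_000
```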

Documenting 5 or more AI systems?

If you operate multiple AI systems and need to document them all under Annex IV, contact us for volume pricing at hello@solidwaretools.com.

Request volume pricing
Reply within one business day

What AICheck guarantees, and what it does not

AICheck produces a document structured under Article 11 and Annex IV of Regulation (EU) 2024/1689 from the information you provide. The accuracy, truthfulness and completeness of that information is your responsibility as provider of the AI system.

We guarantee that the document structure follows Article 11 and Annex IV of Regulation (EU) 2024/1689 and that the legal references cited are correct as of the last verification date. We do not guarantee that a specific document will be accepted by a market surveillance authority in a given case, nor by a commercial buyer in a procurement process.

AICheck is not legal advice. For specific situations, consult a lawyer or specialised regulatory consultancy.

Frequently asked questions

Can the same entity be both provider and deployer?
Yes. If you both develop a high-risk AI system AND use it under your authority — for example, an internal AI built and used in-house — you hold both Art. 16 provider obligations and Art. 26 deployer obligations. Both sets of obligations apply concurrently. The Art. 27 FRIA can in principle reuse Art. 35 GDPR DPIA outputs (Art. 26(9)).
What is Article 25 and when does it flip me from deployer to provider?
Art. 25 sets three flip scenarios: (1)(a) rebranding — putting your name on a system; (1)(b) substantial modification of a high-risk system; (1)(c) modifying the intended purpose of any system (including GPAI) so that it becomes high-risk under Art. 6. When any occurs, the original provider no longer holds Art. 16 obligations for that specific system but must cooperate (Art. 25(2)).
How long must a deployer retain logs?
Art. 26(6) — at least 6 months, unless applicable Union or national law provides otherwise (in particular data-protection law). The retention must be appropriate to the intended purpose of the high-risk AI system. For financial institutions subject to Union financial-services law on internal governance, the logs may be kept as part of the existing documentation regime.
Is this a subscription?
No. One-time payment. The licence includes 30 days of editing and 10 regenerations. The PDF you download is yours to keep.
Can I request a refund?
Pursuant to Article 16(m) of Directive 2011/83/EU on consumer rights, by activating the licence you give express consent to the immediate generation of digital content, waiving the 14-day withdrawal right. Refunds are only accepted in the case of a reproducible technical failure.
What if the regulation changes?
If the regulation changes while your licence is active, you can regenerate the document with the updated version of the generator at no additional cost.
⚠️ Important notice: AICheck is a documentary self-assessment tool, not legal advice nor a third-party audit. The document under Article 11 and Annex IV of Regulation (EU) 2024/1689 is generated from the data you input. The accuracy of that data is your responsibility. AICheck does not replace a qualified professional assessment.

Don't wait for the consultancy. Generate the Annex IV documentation for your AI system in your browser in 45 minutes.

Twelve documents. Annex IV fully structured. Regulation (EU) 2024/1689. Your data does not leave your machine. The ZIP you download is yours to keep.

€249 one-time payment
12 professional documents · 45 minutes · No subscription · 100% in your browser
Generate dossier — €249
✓ Last regulatory verification: 11 May 2026 · No substantive changes detected · View history