
AI Act Article 17: the 13 aspects of the quality management system every high-risk AI provider must have in place.

Article 17 of Regulation (EU) 2024/1689 requires every provider of a high-risk AI system to put in place a quality management system that ensures compliance with the Regulation. The QMS must be documented in a systematic and orderly manner — written policies, procedures and instructions — and cover at least the 13 aspects listed in Art. 17(1) points (a) to (m). Implementation must be proportionate to the provider's size, but the degree of rigour required to ensure compliance is fixed. Financial institutions subject to Union financial-services internal-governance rules may rely on those frameworks under Art. 17(4) — for every point except (g), (h) and (i), which must still be implemented under the AI Act. The QMS documentation must be retained for 10 years under Art. 18(1).

Generate AI Act dossier — €249 · Free: check your AI system risk

€249 one-time payment · 12 PDF documents in ZIP · 45 minutes · 100% in your browser

Regulation (EU) 2024/1689 · Article 11 + Annex IV · 12 documents · 100% browser-side — your data never leaves your machine

The numbers

13 aspects
Art. 17(1) — points (a) to (m). Each must be documented in writing as policies, procedures or instructions.
Art. 17(4)
Financial institutions may rely on internal-governance frameworks under Union financial-services law for every point except (g), (h) and (i), which still apply under the AI Act.
€15M / 3%
Art. 99(4)(a). Penalty tier for breach of Art. 16 obligations, which includes the Art. 17 QMS.

The 13 QMS aspects, each linked to its corresponding article

Art. 17(1) lists 13 aspects in order. Read in pairs with the article each one operationalises — the QMS is the engine that makes the rest of Chapter III work.

a
17(1)(a) — Strategy for regulatory compliance
Including compliance with conformity-assessment procedures and procedures for the management of modifications to the high-risk AI system.
b
17(1)(b) — Design control and verification
Techniques, procedures and systematic actions used for the design, design control and design verification of the high-risk AI system.
c
17(1)(c) — Development quality control and assurance
Techniques, procedures and systematic actions used for the development, quality control and quality assurance of the high-risk AI system.
d
17(1)(d) — Examination, test and validation procedures
Procedures to be carried out before, during and after the development of the high-risk AI system, and the frequency with which they have to be carried out.
e
17(1)(e) — Technical specifications and standards
Technical specifications, including standards, to be applied. Where the relevant harmonised standards are not applied in full or do not cover all the Section 2 requirements, the means to ensure compliance must be specified.
f
17(1)(f) — Data management
Systems and procedures for data management — acquisition, collection, analysis, labelling, storage, filtration, mining, aggregation, retention — and any other operation regarding data performed before and for the purpose of the placing on the market or the putting into service.
g
17(1)(g) — Risk management system (Art. 9)
The risk management system referred to in Art. 9 — an iterative, continuous process. For financial institutions: outside the Art. 17(4) carve-out — this point must still be implemented under the AI Act.
h
17(1)(h) — Post-market monitoring (Art. 72)
Setting-up, implementation and maintenance of a post-market monitoring system in accordance with Art. 72. Outside the Art. 17(4) financial-services carve-out — this point still applies.
i
17(1)(i) — Serious incident reporting (Art. 73)
Procedures related to the reporting of a serious incident in accordance with Art. 73 — 15 days general, 10 days in the event of death, 2 days for widespread infringement. Outside the Art. 17(4) financial-services carve-out — this point still applies.
j
17(1)(j) — Communication with authorities and stakeholders
Handling of communication with national competent authorities, other relevant authorities including those providing or supporting access to data, notified bodies, other operators, customers or interested parties.
k
17(1)(k) — Record-keeping
Systems and procedures for record-keeping of all relevant documentation and information.
l
17(1)(l) — Resource management
Resource management, including security-of-supply related measures.
m
17(1)(m) — Accountability framework
An accountability framework setting out the responsibilities of the management and other staff with regard to all the aspects listed in this paragraph.
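The point-to-article mapping and the Art. 17(4) carve-out can be summarised in a few lines of code — a sketch only: the data structure and function names are illustrative, not an official schema.

```python
# The 13 QMS aspects of Art. 17(1), points (a) to (m).
QMS_ASPECTS = {
    "a": "strategy for regulatory compliance",
    "b": "design control and verification",
    "c": "development quality control and assurance",
    "d": "examination, test and validation procedures",
    "e": "technical specifications and standards",
    "f": "data management",
    "g": "risk management system (Art. 9)",
    "h": "post-market monitoring (Art. 72)",
    "i": "serious incident reporting (Art. 73)",
    "j": "communication with authorities and stakeholders",
    "k": "record-keeping",
    "l": "resource management",
    "m": "accountability framework",
}

# Art. 17(4): for financial institutions, the QMS obligation is deemed
# fulfilled by Union financial-services internal-governance rules,
# EXCEPT points (g), (h) and (i), which still apply under the AI Act.
NOT_DEEMED_FULFILLED = {"g", "h", "i"}

def aspects_to_implement(financial_institution: bool) -> list[str]:
    """Points the provider must address directly under Art. 17."""
    if financial_institution:
        return sorted(NOT_DEEMED_FULFILLED)
    return sorted(QMS_ASPECTS)
```

For a non-financial provider the function returns all 13 points; for a financial institution it returns only (g), (h) and (i) — the three aspects the carve-out does not cover.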

Three common mistakes

COMMON MISTAKE

"We have ISO 9001 / ISO 42001 — the QMS obligation is covered"

Art. 17(3) allows providers subject to sectoral QMS obligations under Union law to include the Art. 17(1) aspects as part of their existing QMS. But the substantive 13 aspects must be addressed. ISO 9001 is a generic quality framework, not a substitute for the AI-specific aspects under Art. 17(1)(f), (g), (h), (i) (data management, AI risk management, post-market monitoring of AI, AI incident reporting). ISO 42001 is closer but does not automatically satisfy Art. 17.

COMMON MISTAKE

"QMS = documentation alone"

Art. 17(1) requires the QMS to be documented "in the form of written policies, procedures and instructions" — documentation is the form, but the substance is operational systems, procedures and behaviours. An audit will look for evidence that the procedures are followed (records, training logs, incident handling, change management). A binder full of policies that nobody applies is not a QMS.

COMMON MISTAKE

"Financial institutions are exempt from Art. 17"

Art. 17(4) is narrower than that. Providers that are financial institutions subject to requirements regarding internal governance, arrangements or processes under Union financial-services law have the obligation to put in place a QMS — with the exception of paragraph 1 points (g), (h) and (i) — deemed fulfilled by complying with those internal-governance rules. Points (g), (h) and (i) of Art. 17(1) — risk management, post-market monitoring and serious-incident reporting — still apply on their own terms.

Does the AI Act apply to your system?

Answer these four questions to determine your obligations.

Does your system infer how to generate outputs — predictions, content, recommendations or decisions — from the inputs it receives?
Art. 3(1) — definition of "AI system"
Is the system placed on the EU market or does its output affect persons in the EU?
Art. 2(1) — territorial scope (extraterritorial via 2(1)(c))
Is your system used in any Annex III domain? (employment, credit, education, law enforcement, migration, justice, critical infrastructure, biometrics)
Art. 6(2) + Annex III — high-risk classification
Are you the provider (developer) or the deployer (user) of the system?
Art. 3(3) provider · Art. 3(4) deployer — different obligations
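The four-question screen above can be sketched as a simple decision function. The field names are illustrative, and a positive result only signals that the full Art. 6 / Annex III analysis is needed — it is not a legal determination.

```python
# Sketch of the four-question applicability screen. Illustrative only.
def preliminary_screen(
    is_ai_system: bool,      # Art. 3(1): infers outputs from inputs
    in_eu_scope: bool,       # Art. 2(1): EU market, or output used in the EU
    annex_iii_domain: bool,  # Art. 6(2): employment, credit, education, etc.
    is_provider: bool,       # Art. 3(3) provider vs Art. 3(4) deployer
) -> str:
    if not (is_ai_system and in_eu_scope):
        return "AI Act likely not applicable"
    if annex_iii_domain:
        role = "provider" if is_provider else "deployer"
        return f"potentially high-risk — check Art. 6(3) exceptions as {role}"
    return "in scope — check Art. 5 prohibitions and Art. 50 transparency"
```

An HR-screening tool sold into the EU by its developer, for example, answers yes to all four questions and lands in the potentially high-risk branch as a provider.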

Take the full AI Act risk classification test →

What the ZIP contains

12 PDF documents generated from your inputs. Each cites the article of Regulation (EU) 2024/1689 it fulfils.

1

Risk Classification Report

Identifies whether your system is prohibited (Art. 5), high-risk (Art. 6 + Annex III) or subject to transparency obligations (Art. 50).

2

Technical Documentation

The 9 blocks of Annex IV in full: system description, training data, validation, performance metrics, risk management, human oversight. Art. 11 + Annex IV.

3

EU Declaration of Conformity

Signable document conforming to Art. 47 and Annex V.

4

Compliance Calendar

Key application dates: 2 Feb 2025, 2 Aug 2025, 2 Aug 2026, 2 Aug 2027. Art. 113.

5

Conformity Sheet

Executive summary of compliance status for authorities or commercial buyers. Art. 43 procedure.

6

Quality Management System (QMS)

QMS structure covering the 13 aspects required by Art. 17.

7

Deployer Instructions

Document for the entity deploying your system, conforming to Art. 13.

8

Evidence Checklist

Verifiable evidence list, cross-referenced to every Annex IV block.

9

Incident Report Template

Notification protocol conforming to Art. 73 (15 days general / 10 days death / 2 days widespread).

10

AI Literacy Programme

Training plan conforming to Art. 4, in force since 2 February 2025.

11

Post-Market Monitoring Plan

Plan structure required by Art. 72 and integrated into the technical documentation under Annex IV(9).

12

Fundamental Rights Impact Assessment (FRIA)

Template under Art. 27 for public bodies, private entities providing public services, and deployers of Annex III points 5(b) and 5(c) systems.

See before you buy — Download a sample dossier (PDF, fictional company) — Real structure, real articles, real format. Fictional data.

Generated from your inputs, in your browser. No data leaves your machine.

What you pay

🧾 AI ACT COMPLIANCE CONSULTANCY
€5,000–€15,000
3–6 months. They explain the obligations to you.
✓ AICHECK
€249
12 documents. 45 minutes. Delivers the documentation.

Technical documentation and conformity assessment: two layers

● LAYER 1

Technical documentation — Annex IV

12 documents. 45 minutes. €249. The documentation your system needs before being placed on the market.

∅ LAYER 2

Conformity assessment by notified body

If your system falls under Art. 43(1) (Annex III point 1 biometrics, where the notified-body route applies) or Art. 43(3) (products covered by Annex I), you will need third-party conformity assessment. That is a separate process. AICheck does not replace it.

We do not sell audits. We do not sell consultancy. We sell the tool that structures your documentation under Annex IV.

Penalty regime

Article 99 of Regulation (EU) 2024/1689. Chapter XII (Penalties) applies from 2 August 2025.

🇪🇺
Non-compliance with prohibited practices (Art. 5)
€35M / 7%

Art. 99(3). Up to €35 million or 7% of total worldwide annual turnover, whichever is higher. For SMEs and start-ups: whichever is lower (Art. 99(6)).

🇪🇺
Non-compliance with operator obligations (high-risk, transparency, deployer)
€15M / 3%

Art. 99(4). Includes failure to draw up technical documentation under Art. 11 + Annex IV. Covers obligations of providers (Art. 16), deployers (Art. 26), authorised representatives (Art. 22), importers (Art. 23), distributors (Art. 24), notified bodies (Art. 31, 33, 34) and transparency under Art. 50.

🇪🇺
Supply of incorrect, incomplete or misleading information
€7.5M / 1%

Art. 99(5). Applies when information provided to notified bodies or national competent authorities is wrong or misleading.
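The cap mechanics — "whichever is higher", flipped to "whichever is lower" for SMEs and start-ups under Art. 99(6) — can be shown with a worked example. The tier names and function are illustrative, not official terminology.

```python
# Art. 99 fine ceilings: fixed amount in EUR, and share of total
# worldwide annual turnover. Illustrative sketch only.
TIERS = {
    "prohibited_practices":  (35_000_000, 0.07),  # Art. 99(3)
    "operator_obligations":  (15_000_000, 0.03),  # Art. 99(4)
    "incorrect_information": (7_500_000,  0.01),  # Art. 99(5)
}

def max_fine(tier: str, turnover_eur: float, sme: bool = False) -> float:
    """Ceiling of the administrative fine for a given tier and turnover."""
    fixed, pct = TIERS[tier]
    caps = (fixed, pct * turnover_eur)
    # Art. 99: whichever is higher — but for SMEs and start-ups,
    # Art. 99(6): whichever is lower.
    return min(caps) if sme else max(caps)
```

For a provider with EUR 2 billion turnover breaching Art. 16 obligations, the ceiling is max(€15M, 3% × €2bn) = €60M; the same breach by an SME is capped at €15M, the lower of the two.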

Documenting 5 or more AI systems?

If you operate multiple AI systems and need to document them all under Annex IV, contact us for volume pricing at hello@solidwaretools.com.

Request volume pricing
Reply within one business day

What AICheck guarantees, and what it does not

AICheck produces a document structured under Article 11 and Annex IV of Regulation (EU) 2024/1689 from the information you provide. The accuracy, truthfulness and completeness of that information is your responsibility as provider of the AI system.

We guarantee that the document structure follows Article 11 and Annex IV of Regulation (EU) 2024/1689 and that the legal references cited are correct as of the last verification date. We do not guarantee that a specific document will be accepted by a market surveillance authority in a given case, nor by a commercial buyer in a procurement process.

AICheck is not legal advice. For specific situations, consult a lawyer or specialised regulatory consultancy.

Frequently asked questions

Can my ISO 9001 / ISO 42001 certification satisfy Article 17?
Partially. Art. 17(3) allows you to include the Art. 17(1) aspects within an existing QMS framework. But the AI-specific aspects — particularly 17(1)(f) data management, 17(1)(g) risk management under Art. 9, 17(1)(h) post-market monitoring under Art. 72 and 17(1)(i) serious-incident reporting under Art. 73 — must be addressed substantively. ISO 42001 covers more of the AI-specific ground than ISO 9001 but does not automatically discharge every Art. 17 obligation.
How does the financial-sector carve-out work?
Art. 17(4) — financial institutions subject to Union financial-services law on internal governance, arrangements or processes are deemed to satisfy the obligation to put in place a QMS, with the exception of points (g), (h) and (i) of Art. 17(1). In practice: risk management, post-market monitoring and serious-incident reporting must still be implemented under the AI Act regime; the rest can be carried within the institution's existing governance framework. Any Art. 40 harmonised standards must be taken into account.
Is the QMS size-proportionate?
Art. 17(2) — implementation of the Art. 17(1) aspects shall be proportionate to the size of the provider's organisation. But the same paragraph adds: providers shall "in any event respect the degree of rigour and the level of protection required to ensure compliance of their high-risk AI systems with this Regulation". Proportionality affects the implementation depth, not the substantive scope of the 13 aspects.
Is this a subscription?
No. One-time payment. The licence includes 30 days of editing and 10 regenerations. The PDF you download is yours to keep.
Can I request a refund?
Pursuant to Article 16(m) of Directive 2011/83/EU on consumer rights, by activating the licence you give express consent to the immediate generation of digital content, waiving the 14-day withdrawal right. Refunds are accepted only in the case of a reproducible technical failure.
What if the regulation changes?
If the regulation changes while your licence is active, you can regenerate the document with the updated version of the generator at no additional cost.
⚠️ Important notice: AICheck is a documentary self-assessment tool, not legal advice nor a third-party audit. The document under Article 11 and Annex IV of Regulation (EU) 2024/1689 is generated from the data you input. The accuracy of that data is your responsibility. AICheck does not replace a qualified professional assessment.

Don't wait for the consultancy. Generate the Annex IV documentation for your AI system in your browser in 45 minutes.

Twelve documents. Annex IV fully structured. Regulation (EU) 2024/1689. Your data does not leave your machine. The ZIP you download is yours to keep.

€249 one-time payment
12 professional documents · 45 minutes · No subscription · 100% in your browser
Generate dossier — €249
✓ Last regulatory verification: 11 May 2026 · No substantive changes detected · View history