
AI Act Annex III: the complete list of 8 high-risk AI domains under Article 6(2) — what triggers technical documentation, QMS and conformity assessment.

Annex III to Regulation (EU) 2024/1689 lists 8 domains in which AI systems are classified as high-risk under Article 6(2). Any AI system intended for use in these areas triggers the full Chapter III, Section 2 requirements — risk management (Art. 9), data governance (Art. 10), technical documentation (Art. 11 + Annex IV), record-keeping (Art. 12), transparency to deployers (Art. 13), human oversight (Art. 14), accuracy/robustness/cybersecurity (Art. 15) — plus provider obligations under Art. 16 and conformity assessment under Art. 43. The obligations apply from 2 August 2026.

Generate AI Act dossier — €249 · Free: check your AI system risk

€249 one-time payment · 12 PDF documents in ZIP · 45 minutes · 100% in your browser

Regulation (EU) 2024/1689 · Article 11 + Annex IV · 12 documents · 100% browser-side — your data never leaves your machine

The numbers

8 domains
Annex III sets out 8 high-risk areas. The Commission may amend the list by delegated act under Art. 7.
2 Aug 2026
Date from which high-risk obligations apply under Art. 113.
€15M / 3%
Art. 99(4). Fines for non-compliance with high-risk obligations (Art. 16, 26, 50).

The 8 Annex III domains, in detail

Each domain has specific sub-items. A system that fits any sub-item is in scope. The list below summarises the Annex III text of Regulation (EU) 2024/1689.

1. Biometrics (where use is permitted by law)
(a) Remote biometric identification systems — excluding biometric verification whose sole purpose is to confirm a specific person is who they claim to be. (b) Biometric categorisation according to sensitive or protected attributes. (c) Emotion recognition. Art. 43(1) allows the notified-body conformity-assessment route for Annex III point 1.
2. Critical infrastructure
AI systems intended to be used as safety components in the management and operation of critical digital infrastructure, road traffic, or in the supply of water, gas, heating or electricity. FRIA under Art. 27 is NOT required for Annex III point 2 deployments.
3. Education and vocational training
(a) Access or admission to educational and vocational institutions at all levels. (b) Evaluation of learning outcomes, including to steer the learning process. (c) Assessing the appropriate level of education. (d) Monitoring and detecting prohibited behaviour of students during tests.
4. Employment, workers' management and access to self-employment
(a) Recruitment or selection — targeted job ads, filtering of applications, candidate evaluation. (b) Decisions affecting terms of work-related relationships, promotion or termination, task allocation based on individual behaviour or personal traits, monitoring and evaluation of performance and behaviour.
5. Access to essential private and public services
(a) Evaluation of eligibility for essential public assistance benefits and services by public authorities, including healthcare — and decisions to grant, reduce, revoke, reclaim. (b) Creditworthiness evaluation or credit scoring of natural persons — EXCEPT systems used for detecting financial fraud. (c) Risk assessment and pricing in life and health insurance. (d) Emergency-call evaluation and dispatching, including triage. FRIA under Art. 27 is required for 5(a), 5(b) and 5(c) deployments by public bodies or private entities providing public services.
6. Law enforcement (where permitted by law)
(a) Risk assessment of victims. (b) Polygraphs and similar tools. (c) Evaluating reliability of evidence. (d) Risk of offending/re-offending not solely based on profiling. (e) Profiling of natural persons under Art. 3(4) of Directive (EU) 2016/680.
7. Migration, asylum and border control
(a) Polygraphs and similar tools. (b) Risk assessment (security, irregular migration, health) of persons entering. (c) Examination of applications for asylum, visa, residence permits. (d) Detecting, recognising or identifying natural persons (except travel-document verification).
8. Administration of justice and democratic processes
(a) Assisting a judicial authority in researching and interpreting facts and law, or applying law to concrete facts, including in alternative dispute resolution. (b) Influencing the outcome of an election or referendum or the voting behaviour of natural persons, excluding administrative/logistical tools.

Three common mistakes

COMMON MISTAKE

"Annex III is the entire scope of high-risk"

No. Art. 6(1) defines a second high-risk route: AI as a safety component of products covered by Annex I harmonisation legislation (machinery, toys, lifts, medical devices, etc.) that require third-party conformity assessment. Annex III only covers Art. 6(2). The Art. 6(1) route applies from 2 August 2027 under Art. 113(c).

COMMON MISTAKE

"Fraud-detection AI is high-risk under Annex III 5(b)"

Annex III 5(b) expressly excludes AI systems used for the purpose of detecting financial fraud from the credit-scoring scope. A pure fraud-detection model is NOT high-risk under 5(b). A hybrid system that performs both credit scoring AND fraud detection remains high-risk for the credit-scoring component.

COMMON MISTAKE

"Our internal HR screening tool is too small to be high-risk"

Annex III 4(a) covers AI used for recruitment, selection, targeted job ads, application filtering and candidate evaluation. Size, internal-only deployment, and number of decisions made are irrelevant to classification. A small internal screening tool is high-risk if it fits 4(a).

Does the AI Act apply to your system?

Answer these four questions to determine your obligations.

Is your system a machine-based system that infers from its inputs how to generate outputs such as predictions, content, recommendations or decisions?
Art. 3(1) — definition of "AI system"
Is the system placed on the EU market or does its output affect persons in the EU?
Art. 2(1) — territorial scope (extraterritorial via 2(1)(c))
Is your system used in any Annex III domain? (employment, credit, education, law enforcement, migration, justice, critical infrastructure, biometrics)
Art. 6(2) + Annex III — high-risk classification
Are you the provider (developer) or the deployer (user) of the system?
Art. 3(3) provider · Art. 3(4) deployer — different obligations

Take the full AI Act risk classification test →

What the ZIP contains

12 PDF documents generated from your inputs. Each cites the article of Regulation (EU) 2024/1689 it fulfils.

1. Risk Classification Report
Identifies whether your system is prohibited (Art. 5), high-risk (Art. 6 + Annex III) or subject to transparency obligations (Art. 50).

2. Technical Documentation
The nine blocks of Annex IV in full, including system description, training data, validation, performance metrics, risk management and human oversight. Art. 11 + Annex IV.

3. EU Declaration of Conformity
Signable document conforming to Art. 47 and Annex V.

4. Compliance Calendar
Key application dates: 2 Feb 2025, 2 Aug 2025, 2 Aug 2026, 2 Aug 2027. Art. 113.

5. Conformity Sheet
Executive summary of compliance status for authorities or commercial buyers. Art. 43 procedure.

6. Quality Management System (QMS)
QMS structure covering the 13 aspects required by Art. 17.

7. Deployer Instructions
Instructions for use for the entity deploying your system, conforming to Art. 13.

8. Evidence Checklist
Verifiable evidence list, cross-referenced to every Annex IV block.

9. Incident Report Template
Notification protocol conforming to Art. 73 (15 days general / 10 days in case of death / 2 days for widespread infringement).

10. AI Literacy Programme
Training plan conforming to Art. 4, in force since 2 February 2025.

11. Post-Market Monitoring Plan
Plan structure required by Art. 72 and integrated into the technical documentation under Annex IV(9).

12. Fundamental Rights Impact Assessment (FRIA)
Template under Art. 27 for public bodies, private entities providing public services, and Annex III 5(b)–(c) deployers.

See before you buy — Download a sample dossier (PDF, fictional company) — Real structure, real articles, real format. Fictional data.

Generated from your inputs, in your browser. No data leaves your machine.

What you pay

🧾 AI ACT COMPLIANCE CONSULTANCY — €5,000–€15,000
3–6 months. They explain the obligations to you.

✓ AICHECK — €249
12 documents. 45 minutes. Produces the documentation.

Technical documentation and conformity assessment: two layers

● LAYER 1

Technical documentation — Annex IV

12 documents. 45 minutes. €249. The documentation your system needs before being placed on the market.

∅ LAYER 2

Conformity assessment by notified body

If your system falls under Art. 43(1) (Annex III point 1 biometrics with notified-body route, or Annex I products), you will need third-party conformity assessment. That is a separate process. AICheck does not replace it.

We do not sell audits. We do not sell consultancy. We sell the tool that structures your documentation under Annex IV.

Penalty regime

Article 99 of Regulation (EU) 2024/1689. Chapter XII (Penalties) applies from 2 August 2025.

Non-compliance with prohibited practices (Art. 5) — €35M / 7%
Art. 99(3). Up to €35 million or 7% of total worldwide annual turnover, whichever is higher. For SMEs and start-ups: whichever is lower (Art. 99(6)).

Non-compliance with operator obligations (high-risk, transparency, deployer) — €15M / 3%
Art. 99(4). Includes failure to draw up technical documentation under Art. 11 + Annex IV. Covers obligations of providers (Art. 16), deployers (Art. 26), authorised representatives (Art. 22), importers (Art. 23), distributors (Art. 24), notified bodies (Art. 31, 33, 34) and transparency under Art. 50.

Supply of incorrect, incomplete or misleading information — €7.5M / 1%
Art. 99(5). Applies when information provided to notified bodies or national competent authorities is incorrect, incomplete or misleading.
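All three tiers follow the same formula: the fixed cap or the turnover percentage, whichever is higher — reversed to whichever is lower for SMEs and start-ups under Art. 99(6). A quick arithmetic sketch (tier keys are our own labels, not official identifiers):

```python
# Art. 99 fine ceilings: fixed cap vs % of total worldwide annual turnover,
# whichever is HIGHER — but for SMEs/start-ups, whichever is LOWER (Art. 99(6)).

TIERS = {
    "art_99_3_prohibited": (35_000_000, 0.07),  # Art. 5 violations
    "art_99_4_operator":   (15_000_000, 0.03),  # Art. 16, 26, 50, etc.
    "art_99_5_info":       (7_500_000, 0.01),   # misleading information
}

def max_fine(tier: str, turnover: float, is_sme: bool = False) -> float:
    cap, pct = TIERS[tier]
    pick = min if is_sme else max
    return pick(cap, pct * turnover)

# €2bn turnover, prohibited practice: 7% = €140M, higher than the €35M cap
print(max_fine("art_99_3_prohibited", 2_000_000_000))        # 140000000.0
# Same turnover but SME: whichever is lower -> the €35M cap
print(max_fine("art_99_3_prohibited", 2_000_000_000, True))  # 35000000
```

These are ceilings, not automatic amounts; Art. 99(7) lists the factors national authorities weigh when setting the actual fine.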

Documenting 5 or more AI systems?

If you operate multiple AI systems and need to document them all under Annex IV, contact us for volume pricing at hello@solidwaretools.com.

Request volume pricing
Reply within one business day

What AICheck guarantees, and what it does not

AICheck produces a document structured under Article 11 and Annex IV of Regulation (EU) 2024/1689 from the information you provide. The accuracy, truthfulness and completeness of that information is your responsibility as provider of the AI system.

We guarantee that the document structure follows Article 11 and Annex IV of Regulation (EU) 2024/1689 and that the legal references cited are correct as of the last verification date. We do not guarantee that a specific document will be accepted by a market surveillance authority in a given case, nor by a commercial buyer in a procurement process.

AICheck is not legal advice. For specific situations, consult a lawyer or specialised regulatory consultancy.

Frequently asked questions

What is the difference between Art. 6(1) and Art. 6(2) high-risk?
Art. 6(1) catches AI systems that are safety components of products covered by Annex I harmonisation legislation (machinery, toys, lifts, medical devices, in vitro diagnostic devices, automotive, etc.) AND require third-party conformity assessment. Art. 6(2) refers to the 8 domains in Annex III. Both routes trigger the same Chapter III, Section 2 requirements, but the conformity-assessment procedure is different: Art. 6(1) follows the sectoral regime, Art. 6(2) follows Art. 43 (internal control under Annex VI for most cases).
Can I avoid Annex III via the Art. 6(3) derogation?
Only if the system fits one of four narrow cases and does not pose a significant risk: (a) narrow procedural task; (b) improvement of a previously completed human activity; (c) detection of decision patterns without replacing human review; (d) preparatory task. Critically, if the system performs profiling of natural persons it is always high-risk — the derogation is unavailable (Art. 6(3), final subparagraph). The provider must still document the assessment under Art. 6(4) and register the system in the EU database under Art. 49(2).
Does my insurance pricing AI fall under Annex III?
Yes, if it performs risk assessment and pricing of natural persons for life or health insurance — Annex III 5(c). Other insurance lines (motor, home, commercial) are not covered by 5(c). For Annex III 5(c) deployments by public bodies or private entities providing public services, a fundamental rights impact assessment is required under Art. 27.
Is this a subscription?
No. One-time payment. The licence includes 30 days of editing and 10 regenerations. The PDF you download is yours to keep.
Can I request a refund?
Pursuant to Article 16(m) of Directive (EU) 2011/83 on consumer rights, by activating the licence you give express consent to the immediate generation of digital content, waiving the 14-day withdrawal right. Refunds are only accepted in the case of a reproducible technical failure.
What if the regulation changes?
If the regulation changes while your licence is active, you can regenerate the document with the updated version of the generator at no additional cost.
⚠️ Important notice: AICheck is a documentary self-assessment tool, not legal advice nor a third-party audit. The document under Article 11 and Annex IV of Regulation (EU) 2024/1689 is generated from the data you input. The accuracy of that data is your responsibility. AICheck does not replace a qualified professional assessment.

Don't wait for the consultancy. Generate the Annex IV documentation for your AI system in your browser in 45 minutes.

Twelve documents. Annex IV fully structured. Regulation (EU) 2024/1689. Your data does not leave your machine. The ZIP you download is yours to keep.

€249 one-time payment
12 professional documents · 45 minutes · No subscription · 100% in your browser
Generate dossier — €249
✓ Last regulatory verification: 11 May 2026 · No substantive changes detected · View history