
AI Act Article 27: the fundamental rights impact assessment for high-risk AI, who must perform it, and what the six elements look like.

Article 27 of Regulation (EU) 2024/1689 introduces the fundamental rights impact assessment (FRIA), an obligation that sits on the deployer side — not the provider side — for specific Annex III high-risk AI systems. It applies to deployers that are bodies governed by public law, deployers that are private entities providing public services, and deployers of high-risk AI systems referred to in Annex III points 5(b) (credit scoring, except fraud detection) and 5(c) (life and health insurance pricing). It does NOT apply to deployments of Annex III point 2 systems (critical infrastructure). The assessment must be performed before deployment and notified to the market surveillance authority. Where a DPIA under Art. 35 GDPR already covers the relevant elements, the FRIA complements rather than replaces it.

Generate AI Act dossier — €249
Free: check your AI system risk

€249 one-time payment · 12 PDF documents in ZIP · 45 minutes · 100% in your browser

Regulation (EU) 2024/1689 · Article 11 + Annex IV · 12 documents · 100% browser-side — your data never leaves your machine

The numbers

6 elements
Art. 27(1) lists six elements (a–f) that the deployer must assess and document before first deployment.
Public + 5(b)(c)
Bodies governed by public law, private entities providing public services, deployers of Annex III 5(b) credit scoring and 5(c) insurance.
NOT for Annex III 2
Art. 27(1) expressly excludes deployments of Annex III point 2 (critical infrastructure) from the FRIA obligation.

Who must perform a FRIA, and the six elements

Article 27 is one of the few obligations that sits primarily on deployers, not providers. The scope is narrow but the consequences are real: a FRIA must be performed before the first use and updated whenever an element changes.

Who must do a FRIA (Art. 27(1))
Prior to deploying a high-risk AI system referred to in Art. 6(2) — with the exception of high-risk AI systems intended to be used in the area listed in point 2 of Annex III — the following deployers must perform a FRIA: (i) bodies governed by public law; (ii) private entities providing public services; (iii) deployers of high-risk AI systems referred to in points 5(b) and (c) of Annex III (credit scoring except fraud detection, life and health insurance pricing).
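The Art. 27(1) scope test above can be sketched as a short boolean check. This is an illustrative sketch only, with hypothetical names; the legal assessment is fact-specific and this is not a substitute for legal analysis:

```python
# Illustrative sketch of the Art. 27(1) FRIA scope test.
# All identifiers are hypothetical labels, not official classifications.

ANNEX_III_POINT_2 = "critical_infrastructure"  # express carve-out (Art. 27(1))
FRIA_ANNEX_III_POINTS = {"5b_credit_scoring", "5c_life_health_insurance"}

def fria_required(annex_iii_point: str,
                  is_public_body: bool,
                  provides_public_services: bool) -> bool:
    """Return True if the deployer must perform a FRIA under Art. 27(1)."""
    if annex_iii_point == ANNEX_III_POINT_2:
        return False  # critical infrastructure is expressly excluded
    if is_public_body or provides_public_services:
        return True   # public bodies and private providers of public services
    return annex_iii_point in FRIA_ANNEX_III_POINTS  # 5(b)/5(c) deployers

# A private bank deploying credit scoring (Annex III 5(b)):
print(fria_required("5b_credit_scoring", False, False))  # True
# A private employer deploying CV screening (Annex III point 4):
print(fria_required("4_employment", False, False))       # False
```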
a
Art. 27(1)(a) — Deployer processes
A description of the deployer's processes in which the high-risk AI system will be used in line with its intended purpose.
b
Art. 27(1)(b) — Time period and frequency
A description of the period of time within which, and the frequency with which, each high-risk AI system is intended to be used.
c
Art. 27(1)(c) — Affected categories of natural persons
The categories of natural persons and groups likely to be affected by its use in the specific context.
d
Art. 27(1)(d) — Specific risks of harm
The specific risks of harm likely to have an impact on the categories identified in (c), taking into account the information given by the provider pursuant to Art. 13 (transparency and provision of information to deployers).
e
Art. 27(1)(e) — Human-oversight measures
A description of the implementation of human-oversight measures, according to the instructions for use provided by the provider.
f
Art. 27(1)(f) — Risk-materialisation arrangements
The measures to be taken in the case of the materialisation of those risks, including the arrangements for internal governance and complaint mechanisms.
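For internal tracking, the six elements (a) to (f) could be mirrored in a simple record. The field names below are hypothetical and this is not the official template (the Art. 27(5) template comes from the AI Office):

```python
from dataclasses import dataclass

# Hypothetical record mirroring Art. 27(1)(a)-(f); field names are our own.
@dataclass
class FRIARecord:
    deployer_processes: str          # (a) processes in which the system is used
    period_and_frequency: str        # (b) intended period and frequency of use
    affected_categories: list[str]   # (c) persons/groups likely affected
    specific_risks: list[str]        # (d) risks of harm, using Art. 13 info
    oversight_measures: str          # (e) human-oversight implementation
    mitigation_arrangements: str     # (f) governance and complaint mechanisms

    def is_complete(self) -> bool:
        """All six elements must be documented before first use (Art. 27(1))."""
        return all([self.deployer_processes, self.period_and_frequency,
                    self.affected_categories, self.specific_risks,
                    self.oversight_measures, self.mitigation_arrangements])
```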
Timing, updates and notification (Art. 27(2), (3))
Art. 27(2): the obligation applies to the first use of the high-risk AI system. In similar cases, deployers may rely on previously conducted FRIAs or on existing impact assessments carried out by the provider. If any of the elements changes or becomes out of date, the deployer must take the necessary steps to update the information. Art. 27(3): once the assessment is performed, the deployer notifies the market surveillance authority of the results, submitting the filled-out template referred to in Art. 27(5). In the case referred to in Art. 46(1) (derogation procedure), deployers may be exempt from that notification.
Interaction with DPIA under Art. 35 GDPR (Art. 27(4))
If any of the Art. 27 obligations is already met through the data protection impact assessment conducted under Art. 35 of Regulation (EU) 2016/679 or Art. 27 of Directive (EU) 2016/680, the FRIA shall complement that DPIA. The two assessments are independent obligations but can be performed jointly. The AI Office develops a template questionnaire under Art. 27(5).

Three common mistakes

COMMON MISTAKE

"FRIA is required for every high-risk AI system"

No. FRIA scope under Art. 27(1) is narrower than general high-risk classification. It applies to (i) deployers that are bodies governed by public law, (ii) private entities providing public services, and (iii) deployers of Annex III points 5(b) (credit scoring) and 5(c) (life and health insurance pricing). Deployers of other Annex III systems (points 1, 3, 4, 6, 7 and 8, as well as 5(a) and 5(d)) do NOT trigger a FRIA outside the public-service context, and point 2 (critical infrastructure) is expressly excluded.

COMMON MISTAKE

"FRIA replaces the DPIA"

Art. 27(4) — where the Art. 35 GDPR DPIA already covers the relevant Art. 27 elements, the FRIA shall complement that DPIA. The two assessments are distinct obligations under different regulations: the DPIA assesses the processing of personal data; the FRIA assesses the impact of the AI deployment on fundamental rights. They can share inputs and be performed jointly, but neither replaces the other.

COMMON MISTAKE

"FRIA is a one-time exercise at deployment"

Art. 27(2) — the FRIA applies to the first use. But if, during the use of the high-risk AI system, the deployer considers that any of the elements listed in paragraph 1 has changed or is no longer up to date, the deployer shall take the necessary steps to update the information. Changes to processes, frequency, affected groups, identified risks or oversight arrangements all trigger updates.

Does the AI Act apply to your system?

Answer these four questions to determine your obligations.

Does your system operate with some autonomy and infer from its inputs how to generate outputs (predictions, content, recommendations, decisions)?
Art. 3(1) — definition of "AI system"
Is the system placed on the EU market or does its output affect persons in the EU?
Art. 2(1) — territorial scope (extraterritorial via 2(1)(c))
Is your system used in any Annex III domain? (employment, credit, education, law enforcement, migration, justice, critical infrastructure, biometrics)
Art. 6(2) + Annex III — high-risk classification
Are you the provider (developer) or the deployer (user) of the system?
Art. 3(3) provider · Art. 3(4) deployer — different obligations
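As an illustrative sketch (hypothetical names; real classification requires legal analysis of Art. 3, Art. 2 and Annex III), the four questions chain roughly like this:

```python
# Hypothetical screening sketch of the four questions above. It compresses
# a legal analysis into booleans, so treat the output as a pointer only.

def aia_screen(is_ai_system: bool, in_eu_scope: bool,
               annex_iii_domain: bool, is_provider: bool) -> str:
    """Map the four screening answers to a first-pass obligations pointer."""
    if not is_ai_system or not in_eu_scope:
        return "outside AI Act scope"
    if annex_iii_domain:
        return ("provider obligations (Art. 16)" if is_provider
                else "deployer obligations (Art. 26, possibly Art. 27 FRIA)")
    return "check transparency obligations (Art. 50)"

print(aia_screen(True, True, True, False))
# deployer obligations (Art. 26, possibly Art. 27 FRIA)
```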

Take the full AI Act risk classification test →

What the ZIP contains

12 PDF documents generated from your inputs. Each cites the article of Regulation (EU) 2024/1689 it fulfils.

1

Risk Classification Report

Identifies whether your system is prohibited (Art. 5), high-risk (Art. 6 + Annex III) or subject to transparency obligations (Art. 50).

2

Technical Documentation

The 9 blocks of Annex IV in full: system description, training data, validation, performance metrics, risk management, human oversight. Art. 11 + Annex IV.

3

EU Declaration of Conformity

Signable document conforming to Art. 47 and Annex V.

4

Compliance Calendar

Key application dates: 2 Feb 2025, 2 Aug 2025, 2 Aug 2026, 2 Aug 2027. Art. 113.

5

Conformity Sheet

Executive summary of compliance status for authorities or commercial buyers. Art. 43 procedure.

6

Quality Management System (QMS)

QMS structure covering the 13 aspects required by Art. 17.

7

Deployer Instructions

Document for the entity deploying your system, conforming to Art. 13.

8

Evidence Checklist

Verifiable evidence list, cross-referenced to every Annex IV block.

9

Incident Report Template

Notification protocol conforming to Art. 73 (15 days general / 10 days death / 2 days widespread).

10

AI Literacy Programme

Training plan conforming to Art. 4, in force since 2 February 2025.

11

Post-Market Monitoring Plan

Plan structure required by Art. 72 and integrated into the technical documentation under Annex IV(9).

12

Fundamental Rights Impact Assessment (FRIA)

Template under Art. 27 for public bodies, private entities providing public services, and Annex III 5(b)(c) deployers.

See before you buy — Download a sample dossier (PDF, fictional company) — Real structure, real articles, real format. Fictional data.

Generated from your inputs, in your browser. No data leaves your machine.

What you pay

🧾 AI ACT COMPLIANCE CONSULTANCY
€5,000–€15,000
3–6 months. They explain the obligations to you.
✓ AICHECK
€249
12 documents. 45 minutes. Produces the documentation itself.

Technical documentation and conformity assessment: two layers

● LAYER 1

Technical documentation — Annex IV

12 documents. 45 minutes. €249. The documentation your system needs before being placed on the market.

∅ LAYER 2

Conformity assessment by notified body

If your system falls under a third-party route of Art. 43 (Annex III point 1 biometrics with the notified-body route under Art. 43(1), or Annex I products under Art. 43(3)), you will need third-party conformity assessment. That is a separate process. AICheck does not replace it.

We do not sell audits. We do not sell consultancy. We sell the tool that structures your documentation under Annex IV.

Penalty regime

Article 99 of Regulation (EU) 2024/1689. Chapter XII (Penalties) applies from 2 August 2025.

🇪🇺
Non-compliance with prohibited practices (Art. 5)
€35M / 7%

Art. 99(3). Up to €35 million or 7% of total worldwide annual turnover, whichever is higher. For SMEs and start-ups: whichever is lower (Art. 99(6)).

🇪🇺
Non-compliance with operator obligations (high-risk, transparency, deployer)
€15M / 3%

Art. 99(4). Includes failure to draw up technical documentation under Art. 11 + Annex IV. Covers obligations of providers (Art. 16), deployers (Art. 26), authorised representatives (Art. 22), importers (Art. 23), distributors (Art. 24), notified bodies (Art. 31, 33, 34) and transparency under Art. 50.

🇪🇺
Supply of incorrect, incomplete or misleading information
€7.5M / 1%

Art. 99(5). Applies when information provided to notified bodies or national competent authorities is wrong or misleading.

Documenting 5 or more AI systems?

If you operate multiple AI systems and need to document them all under Annex IV, contact us for volume pricing at hello@solidwaretools.com.

Request volume pricing
Reply within one business day

What AICheck guarantees, and what it does not

AICheck produces a document structured under Article 11 and Annex IV of Regulation (EU) 2024/1689 from the information you provide. The accuracy, truthfulness and completeness of that information are your responsibility as the provider of the AI system.

We guarantee that the document structure follows Article 11 and Annex IV of Regulation (EU) 2024/1689 and that the legal references cited are correct as of the last verification date. We do not guarantee that a specific document will be accepted by a market surveillance authority in a given case, nor by a commercial buyer in a procurement process.

AICheck is not legal advice. For specific situations, consult a lawyer or specialised regulatory consultancy.

Frequently asked questions

What counts as a "private entity providing public services"?
Article 27(1) does not define the term. In the recitals and general EU-law practice, it typically covers entities providing services of general interest under a contract or regulated mandate from a public authority (for example, private operators of public hospitals, schools, social services or public transport, or other services the public would expect the state to provide). The scope is fact-specific; the conservative reading is to perform a FRIA wherever the service has a clear public-interest dimension.
What is the relationship between the FRIA and the DPIA?
Independent obligations under separate regulations. Art. 27(4) provides that where the DPIA under Art. 35 GDPR already covers the relevant Art. 27 elements (typically points (a), (b), (c) and (d)), the FRIA complements the DPIA rather than replacing it. The two can be performed as a joint document with cross-references. Art. 26(9) of the AI Act lets deployers reuse the Art. 13 transparency information from the provider to comply with both the DPIA and the FRIA.
When must I update or redo the FRIA?
Art. 27(2) — if, during the use of the high-risk AI system, the deployer considers that any of the elements listed in Art. 27(1) has changed or is no longer up to date. Updates are necessary on substantive changes to: deployer processes (a), period of use or frequency (b), affected categories (c), specific risks of harm (d), human-oversight measures (e), risk-materialisation arrangements (f). The deployer may rely on previously conducted FRIAs in similar cases (Art. 27(2)).
Is this a subscription?
No. One-time payment. The licence includes 30 days of editing and 10 regenerations. The PDF you download is yours to keep.
Can I request a refund?
Pursuant to Article 16(m) of Directive 2011/83/EU on consumer rights, by activating the licence you give express consent to the immediate generation of digital content and waive the 14-day withdrawal right. Refunds are accepted only in the case of a reproducible technical failure.
What if the regulation changes?
If the regulation changes while your licence is active, you can regenerate the document with the updated version of the generator at no additional cost.
⚠️ Important notice: AICheck is a documentary self-assessment tool, not legal advice nor a third-party audit. The document under Article 11 and Annex IV of Regulation (EU) 2024/1689 is generated from the data you input. The accuracy of that data is your responsibility. AICheck does not replace a qualified professional assessment.

Don't wait for the consultancy. Generate the Annex IV documentation for your AI system in your browser in 45 minutes.

Twelve documents. Annex IV fully structured. Regulation (EU) 2024/1689. Your data does not leave your machine. The ZIP you download is yours to keep.

€249 one-time payment
12 professional documents · 45 minutes · No subscription · 100% in your browser
Generate dossier — €249
✓ Last regulatory verification: 11 May 2026 · No substantive changes detected