Reg (EU) 2024/2847 · Generate dossier — €149
LIVE — Enforcement tracker · Deadline dashboard · Transposition status — Updated weekly from EUR-Lex, Safety Gate, OEIL & 12 official sources. View regulatory intelligence →

CRA or AI Act — or both: how Article 12 of Regulation (EU) 2024/2847 maps cybersecurity obligations onto high-risk AI systems classified under Article 6 of Regulation (EU) 2024/1689

Article 12 of the Cyber Resilience Act is the bridge between Regulation (EU) 2024/2847 and the AI Act (Regulation (EU) 2024/1689). A product that is both a high-risk AI system under Article 6 of the AI Act and a product with digital elements under the CRA is in scope of both, but Article 12(1) deems compliance with the AI Act’s Article 15 cybersecurity requirements satisfied when the CRA essential requirements are met. For Important (Annex III) and Critical (Annex IV) products, Article 12(3) carves out the cybersecurity assessment so the CRA procedure prevails. This page maps every scenario. CRACheck generates the documentation that demonstrates both.

Generate CRA dossier — €149
Free: check if CRA applies to your product

€149 one-time · 8-document ZIP · 15–25 minutes · Browser-side

Reg (EU) 2024/2847 (CRA) · Reg (EU) 2024/1689 (AI Act) · Art. 12 bridge · Art. 43 AI Act conformity · 100% browser-side

Three coordinates of the CRA/AI Act bridge

Art. 12(1)
Compliance with CRA = deemed compliance with AI Act Art. 15 cybersecurity
Art. 12(2)
AI Act Art. 43 conformity assessment applies
Art. 12(3)
Important / Critical products: CRA procedure prevails for cybersecurity

Six scenarios — which regulation governs

Each row is a typical software product. The right-hand column shows what Article 12 of the CRA tells you to do.

1
Pure software, no AI — only the CRA
Application code, libraries, OS, network tooling without high-risk AI features. Only the CRA applies. Conformity assessment under Article 32: Module A (default), or B+C / H for Important products.
2
AI feature embedded, but NOT a high-risk AI system
An AI module that does not fall under Article 6 of the AI Act (not a high-risk use case, not Annex III of the AI Act). Only the CRA applies for cybersecurity. AI Act transparency obligations may still apply (Art. 50 of the AI Act for limited-risk systems).
3
High-risk AI system that is also a product with digital elements
Article 12(1): if the product fulfils the CRA essential requirements (Annex I, Parts I and II) and the EU declaration of conformity issued under the CRA demonstrates the level of cybersecurity protection required by Article 15 of the AI Act, the product is deemed to comply with Article 15. The AI Act’s Article 43 conformity assessment procedure applies for the broader AI Act compliance (Art. 12(2)).
4
Important product under Annex III of the CRA, also high-risk AI system, AI Act internal control
Article 12(3): where the conformity assessment procedure of Annex VI to the AI Act (internal control) would otherwise apply, the cybersecurity part is subject to the CRA conformity assessment procedures (Art. 32(2)(a), (b), or 32(3)) — NOT to AI Act Annex VI internal control. The AI Act’s internal control covers the rest.
5
Critical product under Annex IV of the CRA, also high-risk AI system
Same rule as for Important products via Art. 12(3). The CRA conformity assessment under Article 32(3) or, where required, the European cybersecurity certification under Article 8 governs the cybersecurity component. The AI Act procedure governs the AI-specific aspects.
6
Manufacturer using AI regulatory sandbox
Article 12(4): manufacturers of products with digital elements that are also high-risk AI systems may participate in the AI regulatory sandboxes referred to in Article 57 of the AI Act.
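The routing logic of the table above can be sketched as a small decision function. This is an illustrative shorthand only — the type names, fields, and return strings are our own labels, not terms from either regulation, and the real analysis depends on the product's full factual classification:

```typescript
// Illustrative sketch of the Article 12 routing described in the table above.
// All identifiers (Product, craClass, etc.) are our own shorthand.
type CraClass = "default" | "important" | "critical";

interface Product {
  digitalElements: boolean; // product with digital elements in scope of the CRA
  highRiskAi: boolean;      // high-risk AI system under Art. 6 AI Act
  craClass: CraClass;       // Annex III (important) / Annex IV (critical)
}

function governingCybersecurityProcedure(p: Product): string {
  if (!p.digitalElements) {
    return "Outside the CRA — assess under the AI Act alone";
  }
  if (!p.highRiskAi) {
    // Scenarios 1-2: CRA governs; AI Act transparency duties may still apply
    return "CRA only (Art. 32); AI Act Art. 50 transparency may still apply";
  }
  if (p.craClass === "default") {
    // Scenario 3: Art. 12(1) presumption via the CRA declaration of conformity
    return "CRA essential requirements; Art. 12(1) deems AI Act Art. 15 satisfied";
  }
  // Scenarios 4-5: Art. 12(3) — CRA procedure prevails for cybersecurity
  return "CRA conformity assessment (Art. 32(2)-(3)) governs cybersecurity; AI Act Annex VI internal control does not";
}
```

For example, an Annex III product that is also a high-risk AI system routes to the Art. 12(3) branch, which is precisely the "internal control overreach" mistake discussed below.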

Common mistakes

DOUBLE COMPLIANCE MYTH

“We need to run two separate conformity assessments”

Not for cybersecurity. Article 12(1) creates a presumption: CRA compliance presumes Article 15 (AI Act cybersecurity) compliance. You produce one EU declaration of conformity under the CRA that covers both, and the AI Act Article 43 procedure handles the rest of the AI Act compliance. Recital 51 confirms the design.

WRONG NOTIFIED BODY

“Our AI notified body can do the CRA cybersecurity assessment”

Only if the AI Act notified body is also competent for CRA assessment under Article 39 of the CRA — and that competence must have been verified during the AI Act notification procedure (Art. 12(2), second sentence). Otherwise you need a CRA-notified body for the cybersecurity assessment of Important / Critical products.

ANNEX VI INTERNAL CONTROL OVERREACH

“Our high-risk AI system can be self-assessed under AI Act Annex VI”

Even when the AI Act allows internal control under Annex VI, Article 12(3) of the CRA forces the cybersecurity component into the CRA procedures for Important and Critical products. Module B+C / H or, where applicable, an EU cybersecurity certificate at ‘substantial’ level are required for that part.

Does the CRA apply to your product?

Four-question self-check. If you answer YES to all four, your product is in scope of Regulation (EU) 2024/2847.

Take the full product classification test →

Choose your licence

One-time payment. No subscription. The downloaded dossier is yours forever.

1 PRODUCT
€149 / product
  • 8-document CRA dossier (ZIP)
  • Product Classifier + Technical Documentation
  • Risk Assessment + User Information
  • 10 regenerations · 30 days
  • 1 licence = 1 product
Buy licence →

What the ZIP contains

8 PDF documents generated from your data. Each cites the specific article of Regulation (EU) 2024/2847 it complies with.

1

Product Classifier

Determines whether your product is Default, Important Class I, Important Class II (Annex III) or Critical (Annex IV). Documents the rationale and the applicable conformity assessment procedure under Article 32.

2

Technical Documentation

Article 31 + Annex VII dossier. Product description, design and development, vulnerability handling processes, risk assessment, list of harmonised standards applied, conformity solutions.

3

Cybersecurity Risk Assessment

Annex I, Part I analysis. Intended purpose, reasonably foreseeable use, operational environment, applicability of each essential requirement, mitigation measures.

4

User Information & Instructions

Annex II. Manufacturer details, single point of contact, intended purpose, support period end date, secure decommissioning, automatic-update opt-out instructions.

5

EU Declaration of Conformity

Article 28 + Annex V. Pre-structured with your classification, applicable conformity module, harmonised standards or certificates relied on, notified body number when applicable.

6

Coordinated Vulnerability Disclosure Policy

Annex I, Part II, point (5). Single point of contact, intake workflow, triage and remediation timeline, public disclosure rules.

7

ENISA Notification Template

Article 14 reporting. Pre-filled 24h early warning, 72h vulnerability/incident notification, 14-day final report templates.
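The three reporting clocks in that template can be illustrated with simple date arithmetic — a sketch that assumes all three deadlines run from a single moment of awareness; the regulation's precise trigger events per notification type should be verified against the text of Article 14:

```typescript
// Sketch of the Art. 14 reporting deadlines (24 h early warning,
// 72 h notification, 14-day final report), counted from awareness.
// Illustrative only — confirm the exact trigger events in the regulation.
const HOUR = 60 * 60 * 1000;
const DAY = 24 * HOUR;

function article14Deadlines(awareness: Date) {
  return {
    earlyWarning: new Date(awareness.getTime() + 24 * HOUR),
    notification: new Date(awareness.getTime() + 72 * HOUR),
    finalReport: new Date(awareness.getTime() + 14 * DAY),
  };
}
```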

8

Obligations Calendar

Personalised milestones: Article 14 reporting starts 11 September 2026, full application 11 December 2027, document retention 10 years, support period (Art. 13(8)) end date.

See before you buy — Download sample dossier (PDF). Real structure, real articles, real format — fictional company and data.

Generated from your data, in your browser. No data leaves your device.

What you pay

🔍 LEGAL OPINION ON CRA + AI ACT INTERACTION
€8,000–€30,000
External counsel mapping Article 12 onto your specific high-risk AI system, identifying which conformity procedure governs which part, and reviewing the single EU declaration of conformity.
CRACHECK — SAME OUTPUT
€149
CRACheck applies the Article 12 logic automatically: it asks if the product is a high-risk AI system, whether it falls in Annex III/IV, and produces the EU DoC that covers both regulations.

Legal sources

Every article and recital cited on this page comes from the official text of Regulation (EU) 2024/2847 (Cyber Resilience Act), published in the Official Journal of the European Union on 20 November 2024 (ELI: data.europa.eu/eli/reg/2024/2847/oj).

Related: Regulation (EU) 2019/881 (Cybersecurity Act, EUCC) · Directive (EU) 2022/2555 (NIS2) · Regulation (EU) 2019/1020 (market surveillance) · Regulation (EU) 2024/1689 (AI Act).

Important notice

This is not legal advice. CRACheck is structured self-assessment software based on Regulation (EU) 2024/2847. The dossier you download is structured documentation, not a third-party audit or certification.

Class II and Critical products still need a notified body. CRACheck prepares the dossier that the notified body will examine — it does not replace the third-party conformity assessment required by Article 32(3) and Article 32(4).

Maximum liability: the amount you paid for the licence. Always verify your specific situation with your legal counsel.

Frequently asked questions

If my product is a high-risk AI system AND a CRA product, do I need two EU declarations?
No. Article 28(3) of the CRA requires a single EU declaration of conformity when the product is subject to more than one Union legal act requiring such a declaration. Article 12(1) tells you that the CRA declaration, when properly structured, demonstrates compliance with Article 15 of the AI Act. The declaration identifies both legal acts and their publication references.
Does the CRA replace Article 15 of the AI Act for high-risk AI systems?
It does not replace it — it presumes compliance with it. Article 12(1) lists three cumulative conditions: (a) the product fulfils the CRA essential requirements in Annex I, Part I; (b) manufacturer processes comply with Annex I, Part II; (c) the achievement of the AI Act cybersecurity level is demonstrated in the EU declaration of conformity issued under the CRA. If those are met, Article 15 of the AI Act is deemed satisfied.
Which notified bodies can act for both regulations?
Article 12(2), second sentence: notified bodies which are competent to control the conformity of high-risk AI systems under the AI Act ‘shall also be competent to control the conformity of high-risk AI systems which fall within the scope of this Regulation [the CRA] with the requirements set out in Annex I to this Regulation, provided that the compliance of those notified bodies with the requirements laid down in Article 39 of this Regulation has been assessed in the context of the notification procedure under Regulation (EU) 2024/1689’.
Is my AI Act regulatory sandbox use covered by the CRA?
Article 12(4) allows participation in the AI regulatory sandboxes referred to in Article 57 of the AI Act. Note that the CRA also creates its own ‘cyber resilience regulatory sandboxes’ under Article 33(2). The two are separate sandboxing regimes operated by Member States.
Is this a subscription?
No. One-time payment. 30-day editing window. 10 regenerations. The PDF dossier is yours permanently.
Can I request a refund?
Under Article 16(m) of Directive (EU) 2011/83, the act of licence activation constitutes express consent for immediate digital content generation, which removes the right of withdrawal. Refunds are issued only for reproducible technical failures.
What if the regulation changes before I file my dossier?
Regenerate at no additional cost during your licence validity. Substantive amendments to Regulation (EU) 2024/2847 are tracked weekly from EUR-Lex; if a clause you cited is amended, you can regenerate the affected sections.
€149 one-time
8-document ZIP · 15–25 minutes · Browser-side

One declaration. Both regulations. Article 12 handled.

CRACheck produces the single EU declaration of conformity required by Article 28(3) that covers both Regulation (EU) 2024/2847 and Article 15 of the AI Act — plus the technical documentation entries that the AI Act Annex IV record requires.

Generate dossier — €149