Chatbots and virtual assistants sit at the intersection of two AI Act regimes. Article 50, which applies to every chatbot interacting with a natural person regardless of risk class, requires the provider to design the system so that users are informed they are interacting with AI, and requires providers of generative chatbots to mark synthetic outputs as artificially generated in a machine-readable format. Where the chatbot is also used in an Annex III domain (for example, an HR assistant filtering candidates, 4(a); a banking assistant making credit-related decisions, 5(b); an education assistant assessing learning, 3(b)), the system additionally becomes a high-risk AI system, triggering the full Chapter III, Section 2 requirements. Both layers apply from 2 August 2026 under Art. 113.
€249 one-time payment · 12 PDF documents in ZIP · 45 minutes · 100% in your browser
Three distinct regimes can apply to the same chatbot. Map them in this order: the disclosure obligation always applies; synthetic content marking applies when the chatbot generates content; high-risk obligations apply when the use case sits in Annex III.
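The mapping order above can be sketched as a small classification helper. This is an illustrative sketch only: the `Chatbot` type, its fields and the function name are hypothetical, not part of the Regulation or of any official tooling, and a real assessment requires legal analysis of the concrete use case.

```python
from dataclasses import dataclass

@dataclass
class Chatbot:
    generates_content: bool     # e.g. free-text generative answers or synthetic voice
    annex_iii_use_case: bool    # e.g. recruitment filtering (Annex III 4(a))

def applicable_regimes(bot: Chatbot) -> list[str]:
    """Apply the three checks in order (illustrative only)."""
    # The disclosure obligation always applies to chatbots.
    regimes = ["Art. 50(1) disclosure"]
    # Synthetic-content marking applies when the chatbot generates content.
    if bot.generates_content:
        regimes.append("Art. 50(2) synthetic-content marking")
    # High-risk obligations apply when the use case sits in Annex III.
    if bot.annex_iii_use_case:
        regimes.append("High-risk: Chapter III, Section 2")
    return regimes
```

For instance, a generative HR assistant that filters candidates accumulates all three regimes; a plain FAQ bot outside Annex III keeps only the disclosure obligation.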
The Art. 50(1) exemption applies where the AI nature is "obvious from the point of view of a natural person who is reasonably well-informed, observant and circumspect, taking into account the circumstances and the context of use". A modern customer-service assistant, branded with a human-sounding name, capable of natural conversation and integrated into a help interface, does not necessarily satisfy that standard. The conservative reading is to disclose at the first interaction at the latest (Art. 50(5)).
Art. 50(2) covers AI systems generating synthetic audio. A voice assistant that synthesises speech generates synthetic audio output, so the provider must ensure that output is marked in a machine-readable format as artificially generated. The Art. 50(2) exemption for assistive functions covers standard editing and non-substantive alteration of input data, not voice synthesis itself.
Annex III 4(a) lists "AI systems intended to be used for the recruitment or selection of natural persons, in particular to place targeted job advertisements, to analyse and filter job applications, and to evaluate candidates". An HR chatbot that filters or evaluates candidates falls under 4(a) and is high-risk under Art. 6(2) — triggering Art. 11 + Annex IV documentation, Art. 17 QMS, Art. 43 conformity assessment, Art. 26 deployer obligations, plus Art. 50 transparency on top.
Answer these four questions to determine your obligations.
12 PDF documents generated from your inputs. Each cites the article of Regulation (EU) 2024/1689 it fulfils.
Identifies whether your system is prohibited (Art. 5), high-risk (Art. 6 + Annex III) or subject to transparency obligations (Art. 50).
The 9 blocks of Annex IV in full, including system description, training data, validation, performance metrics, risk management and human oversight. Art. 11 + Annex IV.
Signable document conforming to Art. 47 and Annex V.
Key application dates: 2 Feb 2025, 2 Aug 2025, 2 Aug 2026, 2 Aug 2027. Art. 113.
Executive summary of compliance status for authorities or commercial buyers. Art. 43 procedure.
QMS structure covering the 13 aspects required by Art. 17.
Document for the entity deploying your system, conforming to Art. 13.
Verifiable evidence list, cross-referenced to every Annex IV block.
Notification protocol conforming to Art. 73 (15 days general / 10 days death / 2 days widespread).
AI literacy training plan conforming to Art. 4, in force since 2 February 2025.
Plan structure required by Art. 72 and integrated into the technical documentation under Annex IV(9).
Template under Art. 27 for public bodies, private entities providing public services, and Annex III 5(b)(c) deployers.
See before you buy — Download a sample dossier (PDF, fictional company) — Real structure, real articles, real format. Fictional data.
Generated from your inputs, in your browser. No data leaves your machine.
12 documents. 45 minutes. €249. The documentation your system needs before being placed on the market.
If your system falls under Art. 43(1) (Annex III point 1 biometrics with notified-body route, or Annex I products), you will need third-party conformity assessment. That is a separate process. AICheck does not replace it.
We do not sell audits. We do not sell consultancy. We sell the tool that structures your documentation under Annex IV.
Article 99 of Regulation (EU) 2024/1689. Chapter XII (Penalties) applies from 2 August 2025.
Art. 99(3). Non-compliance with the Art. 5 prohibitions. Up to €35 million or 7% of total worldwide annual turnover, whichever is higher. For SMEs and start-ups: whichever is lower (Art. 99(6)).
Art. 99(4). Up to €15 million or 3% of total worldwide annual turnover, whichever is higher. Includes failure to draw up technical documentation under Art. 11 + Annex IV. Covers obligations of providers (Art. 16), deployers (Art. 26), authorised representatives (Art. 22), importers (Art. 23), distributors (Art. 24), notified bodies (Art. 31, 33, 34) and transparency under Art. 50.
Art. 99(5). Up to €7.5 million or 1% of total worldwide annual turnover, whichever is higher. Applies when information supplied to notified bodies or national competent authorities is incorrect, incomplete or misleading.
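The cap arithmetic of Art. 99 can be sketched as a small helper: the maximum fine is the higher of the fixed amount and the turnover percentage, except for SMEs and start-ups, where Art. 99(6) takes the lower of the two. The function name and parameters are illustrative, not taken from the Regulation.

```python
def max_fine_eur(tier_fixed_eur: float, tier_pct: float,
                 worldwide_turnover_eur: float, is_sme: bool = False) -> float:
    """Maximum administrative fine for one Art. 99 tier (illustrative sketch).

    tier_fixed_eur: the fixed cap of the tier (e.g. 35_000_000 for Art. 99(3))
    tier_pct:       the turnover percentage of the tier (e.g. 0.07 for 7%)
    """
    pct_based = tier_pct * worldwide_turnover_eur
    # Art. 99(6): for SMEs and start-ups, whichever amount is lower applies.
    if is_sme:
        return min(tier_fixed_eur, pct_based)
    return max(tier_fixed_eur, pct_based)

# Art. 99(3) tier: for a group with EUR 1 billion turnover, the 7% figure
# (EUR 70 million) exceeds the fixed EUR 35 million, so it sets the cap.
```

For a large provider, the percentage dominates as turnover grows; for an SME with modest turnover, Art. 99(6) makes the percentage figure the binding cap instead.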
If you operate multiple AI systems and need to document them all under Annex IV, contact us for volume pricing at hello@solidwaretools.com.
Request volume pricing

AICheck produces a document structured under Article 11 and Annex IV of Regulation (EU) 2024/1689 from the information you provide. The accuracy, truthfulness and completeness of that information are your responsibility as provider of the AI system.
We guarantee that the document structure follows Article 11 and Annex IV of Regulation (EU) 2024/1689 and that the legal references cited are correct as of the last verification date. We do not guarantee that a specific document will be accepted by a market surveillance authority in a given case, nor by a commercial buyer in a procurement process.
AICheck is not legal advice. For specific situations, consult a lawyer or specialised regulatory consultancy.
Twelve documents. Annex IV fully structured. Regulation (EU) 2024/1689. Your data does not leave your machine. The ZIP you download is yours to keep.