Article 50 of Regulation (EU) 2024/1689 introduces transparency obligations that apply to specific categories of AI systems — regardless of whether the system is also classified as high-risk under Annex III. The obligations sit in Chapter IV and apply from 2 August 2026 under the general application date in Art. 113. There are four distinct cases: (1) AI systems interacting directly with natural persons must disclose their AI nature; (2) providers of generative AI must mark synthetic content in a machine-readable format as artificially generated; (3) deployers of emotion-recognition or biometric-categorisation systems must inform exposed persons; (4) deployers of deep-fake systems must disclose the artificial nature of the content.
€249 one-time payment · 12 PDF documents in ZIP · 45 minutes · 100% in your browser
Each Article 50 paragraph has its own trigger, its own responsible party (provider or deployer) and its own exemption. Read carefully — "obvious from context" exemptions are narrow and don't cover most commercial use cases.
Article 50 is in Chapter IV and is independent of Annex III high-risk classification. A chatbot that is not high-risk under Annex III still triggers Art. 50(1). A consumer image-generation tool that is not high-risk still triggers Art. 50(2) synthetic-content marking. A wellbeing app using emotion recognition outside workplace/education still triggers Art. 50(3) deployer-side notice.
Art. 3(60) defines deep fake as "AI-generated or manipulated image, audio or video content that resembles existing persons, objects, places, entities or events and would falsely appear to a person to be authentic or truthful". The definition covers audio (voice cloning), image (face swap, photo manipulation) and video. Art. 50(4) second subparagraph extends a similar disclosure obligation to AI-generated text published on matters of public interest, with carve-outs for editorial review and law-enforcement authorisation.
The exemption applies where the AI nature is obvious to a "reasonably well-informed, observant and circumspect" natural person, taking into account the circumstances and context of use. The standard is narrow: a website visitor encountering a voice or messaging assistant that sounds human, even one labelled with a name, is not necessarily on notice. The conservative reading is to disclose at the first interaction, the latest point Art. 50(5) permits.
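The four trigger cases above can be sketched as a simple classification, useful as a first-pass checklist. This is an illustrative sketch only: the field names, the `AISystem` class and the function are hypothetical, and only the article references come from Regulation (EU) 2024/1689. Exemptions (e.g. "obvious from context", law-enforcement carve-outs) are not modelled.

```python
from dataclasses import dataclass

@dataclass
class AISystem:
    # Hypothetical flags mapping to the four Art. 50 cases.
    interacts_with_persons: bool = False        # chatbots, voice assistants
    generates_synthetic_content: bool = False   # image/audio/video/text generation
    emotion_or_biometric_categorisation: bool = False
    produces_deep_fakes: bool = False

def art_50_obligations(s: AISystem) -> list[str]:
    """Return the Art. 50 paragraphs triggered, with the responsible party."""
    obligations = []
    if s.interacts_with_persons:
        obligations.append("Art. 50(1): provider discloses AI nature")
    if s.generates_synthetic_content:
        obligations.append("Art. 50(2): provider marks output in machine-readable format")
    if s.emotion_or_biometric_categorisation:
        obligations.append("Art. 50(3): deployer informs exposed persons")
    if s.produces_deep_fakes:
        obligations.append("Art. 50(4): deployer discloses artificial content")
    return obligations
```

Note that the responsible party shifts between provider (paragraphs 1 and 2) and deployer (paragraphs 3 and 4), which is why the output names both the paragraph and the party.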
Answer these four questions to determine your obligations.
12 PDF documents generated from your inputs. Each cites the article of Regulation (EU) 2024/1689 it fulfils.
Identifies whether your system is prohibited (Art. 5), high-risk (Art. 6 + Annex III) or subject to transparency obligations (Art. 50).
All 9 blocks of Annex IV, including system description, training data, validation and testing, performance metrics, risk management and human oversight. Art. 11 + Annex IV.
Signable document conforming to Art. 47 and Annex V.
Key application dates: 2 Feb 2025, 2 Aug 2025, 2 Aug 2026, 2 Aug 2027. Art. 113.
Executive summary of compliance status for authorities or commercial buyers. Art. 43 procedure.
QMS structure covering the 13 aspects required by Art. 17.
Document for the entity deploying your system, conforming to Art. 13.
Verifiable evidence list, cross-referenced to every Annex IV block.
Notification protocol conforming to Art. 73 (15 days general / 10 days death / 2 days widespread).
Training plan conforming to Art. 4, in force since 2 February 2025.
Plan structure required by Art. 72 and integrated into the technical documentation under Annex IV(9).
Template under Art. 27 for public bodies, private entities providing public services, and deployers of systems under Annex III points 5(b) and 5(c).
See before you buy — Download a sample dossier (PDF, fictional company) — Real structure, real articles, real format. Fictional data.
Generated from your inputs, in your browser. No data leaves your machine.
12 documents. 45 minutes. €249. The documentation your system needs before being placed on the market.
If your system falls under Art. 43(1) (Annex III point 1 biometrics with notified-body route, or Annex I products), you will need third-party conformity assessment. That is a separate process. AICheck does not replace it.
We do not sell audits. We do not sell consultancy. We sell the tool that structures your documentation under Annex IV.
Article 99 of Regulation (EU) 2024/1689. Chapter XII (Penalties) applies from 2 August 2025.
Art. 99(3). Up to €35 million or 7% of total worldwide annual turnover, whichever is higher. For SMEs and start-ups: whichever is lower (Art. 99(6)).
Art. 99(4). Up to €15 million or 3% of total worldwide annual turnover, whichever is higher. Includes failure to draw up technical documentation under Art. 11 + Annex IV. Covers obligations of providers (Art. 16), deployers (Art. 26), authorised representatives (Art. 22), importers (Art. 23), distributors (Art. 24), notified bodies (Arts. 31, 33, 34) and transparency under Art. 50.
Art. 99(5). Up to €7.5 million or 1% of total worldwide annual turnover, whichever is higher. Applies when information supplied to notified bodies or national competent authorities is incorrect, incomplete or misleading.
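How the fixed cap and the turnover percentage combine can be shown with a short sketch. The tier amounts are those of Art. 99(3)-(5); the function name and structure are illustrative, and the regulation sets ceilings, not a formula for the fine actually imposed.

```python
# Maximum-fine ceilings per tier: (fixed cap in EUR, turnover percentage).
TIERS = {
    "99(3)": (35_000_000, 0.07),  # prohibited practices (Art. 5)
    "99(4)": (15_000_000, 0.03),  # other operator obligations, incl. Art. 50
    "99(5)": (7_500_000, 0.01),   # incorrect info to notified bodies/authorities
}

def max_fine(tier: str, turnover_eur: float, sme: bool = False) -> float:
    """Upper ceiling of the fine for a given tier and worldwide annual turnover."""
    fixed, pct = TIERS[tier]
    candidates = (fixed, pct * turnover_eur)
    # Art. 99(6): for SMEs and start-ups, whichever amount is LOWER applies;
    # for everyone else, whichever is higher.
    return min(candidates) if sme else max(candidates)
```

For example, a non-SME with €1 billion turnover faces a 99(3) ceiling of €70 million (7% exceeds the €35 million fixed cap), while an SME with the same turnover is capped at €35 million.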
If you operate multiple AI systems and need to document them all under Annex IV, contact us for volume pricing at hello@solidwaretools.com.
Request volume pricing
AICheck produces a document structured under Article 11 and Annex IV of Regulation (EU) 2024/1689 from the information you provide. The accuracy, truthfulness and completeness of that information are your responsibility as provider of the AI system.
We guarantee that the document structure follows Article 11 and Annex IV of Regulation (EU) 2024/1689 and that the legal references cited are correct as of the last verification date. We do not guarantee that a specific document will be accepted by a market surveillance authority in a given case, nor by a commercial buyer in a procurement process.
AICheck is not legal advice. For specific situations, consult a lawyer or specialised regulatory consultancy.
Twelve documents. Annex IV fully structured. Regulation (EU) 2024/1689. Your data does not leave your machine. The ZIP you download is yours to keep.