Legal Basis

Self-Assessment Under the EU AI Act

Art. 43(2) makes internal self-assessment the default conformity path for most high-risk AI systems. A notified body is required only in one narrow case. Here is what the law actually says.

Statutory Text — Art. 43(2)

“For high-risk AI systems referred to in points 2 to 8 of Annex III, providers shall follow the conformity assessment procedure based on internal control as referred to in Annex VI, which does not provide for the involvement of a notified body.”

EU AI Act, Regulation (EU) 2024/1689, Art. 43(2)

Which procedure applies to your system?

| AI System Type | Procedure | Legal Basis |
| --- | --- | --- |
| High-risk AI systems, Annex III points 2–8 | Self-assessment (Annex VI) | Art. 43(2) |
| High-risk AI systems, Annex III point 1 (biometrics) | Annex VI or Annex VII at the provider's choice where harmonised standards are applied in full; Annex VII (notified body) where they are not | Art. 43(1) |
| High-risk AI systems, Annex I product safety components | Conformity procedure of the relevant sectoral product legislation | Art. 43(3) |
| GPAI models (standard obligations) | Chapter V obligations; no Annex VI/VII procedure | Arts. 51–56 |
| Limited risk (chatbots, synthetic content, emotion recognition) | Transparency obligations only | Art. 50 |
| Minimal risk | No mandatory procedure | — |
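
If you are encoding this routing in internal compliance tooling, a minimal sketch might look like the following. All names and enum values are ours, not the Act's, and the logic simplifies Art. 43 (for Annex III point 1 with harmonised standards fully applied, the provider may also opt for Annex VII; the sketch defaults to Annex VI):

```python
from enum import Enum, auto

class Procedure(Enum):
    ANNEX_VI_INTERNAL_CONTROL = auto()   # self-assessment, no notified body
    ANNEX_VII_NOTIFIED_BODY = auto()     # third-party assessment
    SECTORAL_LEGISLATION = auto()        # Annex I: procedure of the product law
    CHAPTER_V_OBLIGATIONS = auto()       # GPAI: documentation and copyright duties
    TRANSPARENCY_ONLY = auto()           # Art. 50
    NONE_REQUIRED = auto()

def applicable_procedure(annex_iii_point: int | None,
                         annex_i_component: bool = False,
                         gpai: bool = False,
                         limited_risk: bool = False,
                         harmonised_standards_applied: bool = False) -> Procedure:
    """Route a system to its conformity path per Art. 43 (simplified sketch)."""
    if gpai:
        return Procedure.CHAPTER_V_OBLIGATIONS
    if annex_i_component:
        return Procedure.SECTORAL_LEGISLATION            # Art. 43(3)
    if annex_iii_point == 1:                             # biometrics, Art. 43(1)
        # With standards fully applied the provider may choose VI or VII;
        # we default to internal control here.
        return (Procedure.ANNEX_VI_INTERNAL_CONTROL if harmonised_standards_applied
                else Procedure.ANNEX_VII_NOTIFIED_BODY)
    if annex_iii_point is not None:                      # points 2-8, Art. 43(2)
        return Procedure.ANNEX_VI_INTERNAL_CONTROL
    if limited_risk:
        return Procedure.TRANSPARENCY_ONLY
    return Procedure.NONE_REQUIRED
```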

The Annex VI Internal Control Procedure — Step by Step

1. Establish a Quality Management System (Art. 17)

Document your QMS covering: development methodology, design decisions, risk management processes, data governance, testing procedures, post-market monitoring, and incident reporting. Implementation must be proportionate to the size of the provider's organisation (Art. 17(2)).
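
As a rough illustration, a compliance tool might track QMS coverage with a checklist like the sketch below. The area names paraphrase Art. 17(1); they are not statutory headings:

```python
# Hypothetical QMS coverage checklist; area names paraphrase Art. 17(1).
REQUIRED_QMS_AREAS = {
    "development_methodology",
    "design_decisions",
    "risk_management",          # links to the Art. 9 process
    "data_governance",          # Art. 10
    "testing_and_validation",
    "post_market_monitoring",   # Art. 72
    "incident_reporting",       # Art. 73
}

def missing_qms_areas(documented_areas: set[str]) -> set[str]:
    """Return the QMS areas that still lack documentation."""
    return REQUIRED_QMS_AREAS - documented_areas
```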

2. Prepare Technical Documentation (Annex IV)

Annex IV must be documented in full before the system is placed on the market. Its nine points cover: a general description of the system; its elements and development process (including training data, where relevant); monitoring, functioning and control; performance metrics; the risk management system; lifecycle changes; the harmonised standards applied; a copy of the EU Declaration of Conformity; and the post-market monitoring plan.
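
A simple gate in your release process can refuse market placement until every Annex IV point has a document behind it. The keys below are our paraphrases of the Annex IV point titles, not official identifiers:

```python
# Hypothetical manifest keyed on the nine Annex IV points (paraphrased titles).
ANNEX_IV_POINTS = [
    "general_description",
    "elements_and_development_process",   # incl. training data where relevant
    "monitoring_functioning_control",
    "performance_metrics",
    "risk_management_system",
    "lifecycle_changes",
    "standards_applied",
    "declaration_of_conformity_copy",
    "post_market_monitoring_plan",
]

def ready_for_market(docs: dict[str, str]) -> bool:
    """True only if every Annex IV point maps to a non-empty document."""
    return all(docs.get(point, "").strip() for point in ANNEX_IV_POINTS)
```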

3. Conduct the Risk Management Process (Art. 9)

Identify and analyse the known and reasonably foreseeable risks to health, safety, and fundamental rights. Implement risk mitigation measures. Evaluate residual risk. The risk management process must be continuous: updated whenever the system is modified or new risks emerge.
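
A risk register with explicit residual-risk scoring makes the continuity requirement auditable. The sketch below uses an internal 1–5 severity/likelihood scale of our own invention; the Act prescribes no particular scoring scheme:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskEntry:
    """One row of a hypothetical Art. 9 risk register."""
    hazard: str                      # known or reasonably foreseeable risk
    affected_interests: list[str]    # health, safety, fundamental rights
    mitigation: str
    residual_severity: int           # 1 (negligible) .. 5 (critical), our scale
    residual_likelihood: int         # 1 .. 5, our scale
    last_reviewed: date = field(default_factory=date.today)

    def residual_acceptable(self, threshold: int = 6) -> bool:
        """Compare the residual risk score to an internally defined threshold."""
        return self.residual_severity * self.residual_likelihood <= threshold
```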

4. Complete Testing and Validation (Art. 9(7))

Test the system's performance for its intended purpose, including with respect to the persons or groups of persons on whom it is intended to be used. Validate that the technical requirements (accuracy, robustness, cybersecurity) are met, and document your methodology and results.
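
Because Art. 9 expects testing against metrics and thresholds defined in advance, it helps to fix those thresholds in code before any results arrive. The metric names, threshold values, and group labels below are purely illustrative:

```python
# Thresholds defined *before* testing, echoing Art. 9's requirement for
# prior defined metrics. Values are illustrative, not regulatory.
THRESHOLDS = {"accuracy": 0.95, "false_positive_rate": 0.02}

def validate_group(metrics: dict[str, float]) -> list[str]:
    """Return the names of metrics that fail their pre-defined threshold."""
    failures = []
    if metrics["accuracy"] < THRESHOLDS["accuracy"]:
        failures.append("accuracy")
    if metrics["false_positive_rate"] > THRESHOLDS["false_positive_rate"]:
        failures.append("false_positive_rate")
    return failures

# One call per group of persons on whom the system is intended to be used:
results_by_group = {
    "group_a": {"accuracy": 0.97, "false_positive_rate": 0.01},
    "group_b": {"accuracy": 0.93, "false_positive_rate": 0.03},
}
for group, results in results_by_group.items():
    print(group, validate_group(results) or "pass")
```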

5. Draw Up the EU Declaration of Conformity (Art. 47)

Complete the written declaration that the AI system conforms to the applicable requirements of the EU AI Act. Per Annex V, the declaration must include: system identification, provider details, a statement of conformity, references to the harmonised standards or common specifications applied, place and date of issue, and an authorised signature.
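
A generator that cannot emit a declaration without every required field is a cheap safeguard against omissions. The field set below paraphrases Annex V; the layout is ours, not an official template:

```python
from datetime import date

def declaration_of_conformity(system_name: str, system_version: str,
                              provider: str, provider_address: str,
                              standards: list[str], signatory: str) -> str:
    """Render a plain-text declaration skeleton (field set paraphrases Annex V)."""
    return "\n".join([
        "EU DECLARATION OF CONFORMITY",
        f"AI system: {system_name} (version {system_version})",
        f"Provider: {provider}, {provider_address}",
        "This declaration is issued under the sole responsibility of the provider.",
        "The AI system named above conforms to Regulation (EU) 2024/1689.",
        "Harmonised standards / common specifications applied: "
        + (", ".join(standards) or "none"),
        f"Date of issue: {date.today().isoformat()}",
        f"Signed for and on behalf of the provider: {signatory}",
    ])
```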

6. Affix CE Marking (Art. 48)

Apply the CE marking visibly and legibly to the AI system or its accompanying documentation. Where the system is embedded in a product covered by other Union harmonisation legislation, a single CE marking covers all applicable requirements.

7. Register in the EU Database (Art. 49)

Providers of high-risk AI systems listed in Annex III must register the system in the EU database referred to in Art. 71 (maintained by the Commission) before placing it on the market or putting it into service. Deployers that are public authorities or Union bodies (Art. 49(3)) have separate registration obligations.
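
Internally, the registration data can be staged as a simple record before submission through the database interface. The fields below loosely follow the information required by Annex VIII; the names and values are illustrative only:

```python
# Hypothetical staging record; the actual data fields are set out in Annex VIII,
# and submission happens through the Commission-run EU database, not this dict.
registration_record = {
    "provider_name": "Example Provider GmbH",          # illustrative
    "system_name": "example-triage-system",            # illustrative
    "annex_iii_point": 5,                              # e.g. essential services
    "intended_purpose": "one-paragraph summary of the intended purpose",
    "status": "on the market",
    "declaration_of_conformity_ref": "DoC-2025-001",   # illustrative
}
```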

Frequently asked questions

Can I self-certify compliance with the EU AI Act?

For most high-risk AI systems, yes. Art. 43(2) makes the Annex VI internal control procedure the default. You assess your own system against the requirements, document your process, draw up a Declaration of Conformity, and affix a CE mark. No third-party involvement is legally required unless you fall within the Annex III point 1 biometrics exception (Art. 43(1)).

Do I need an external auditor?

Not legally, unless your system falls under point 1 of Annex III (biometrics) and you have not applied harmonised standards or common specifications in full (Art. 43(1) + Annex VII). For all other high-risk AI systems, Art. 43(2) permits, and in fact presumes, internal self-assessment. Independent review is nonetheless prudent before regulatory submission or public disclosure.

What if a national authority challenges my self-assessment?

You must be able to produce your full technical documentation (Annex IV), your risk management records (Art. 9), your quality management system documentation (Art. 17), your test results, and your Declaration of Conformity (Art. 47) on request. This is why generating and retaining your evidence package is critical — the self-assessment must be substantive, not nominal.
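
One practical way to keep that evidence package defensible is to hash its contents when it is assembled, so any later modification is detectable. This is a hypothetical retention helper; the Act prescribes what to keep, not this format:

```python
import hashlib
import json
import pathlib
from datetime import datetime, timezone

def build_evidence_manifest(package_dir: str) -> str:
    """Hash every file in the evidence package so later tampering is detectable.

    Hypothetical retention helper. The Act prescribes the contents (Annex IV
    docs, Art. 9 records, Art. 17 QMS, test results, the Art. 47 declaration),
    not this manifest format.
    """
    manifest = {
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "files": {
            str(p.relative_to(package_dir)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(pathlib.Path(package_dir).rglob("*"))
            if p.is_file() and p.name != "manifest.json"
        },
    }
    out = pathlib.Path(package_dir) / "manifest.json"
    out.write_text(json.dumps(manifest, indent=2))
    return str(out)
```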

Does self-assessment mean less scrutiny?

No. The EU AI Act requires providers to operate a post-market monitoring system (Art. 72), and market surveillance authorities supervise compliance. AI systems that cause serious incidents must be reported (Art. 73). Authorities can request full documentation, order corrective action, or withdraw systems from the market. Self-assessment shifts the burden to you; it does not reduce accountability.

What about GPAI models?

General-purpose AI models are governed by Chapter V (Arts. 51–56), not by the conformity assessment procedure; they are not subject to the Annex VI/VII procedure. However, GPAI providers must maintain technical documentation, implement a policy to comply with Union copyright law, and, for models with systemic risk, conduct model evaluations including adversarial testing.

Legal disclaimer

This page provides an educational summary of EU AI Act conformity assessment procedures. It is not legal advice. The EU AI Act is a complex regulation — exact obligations depend on your specific system, sector, and deployment context. Consult a qualified EU AI Act legal specialist before relying on any self-assessment for regulatory or procurement purposes. Always verify against the official text on EUR-Lex.