EU AI Act · Article 9

Risk management system

View on EUR-Lex
Plain-English explainer

Article 9 requires every high-risk AI system to have a documented, continuous, and iterative risk management system covering the entire lifecycle. You must identify and analyse known and reasonably foreseeable risks, estimate and evaluate risks arising from both the intended purpose and reasonably foreseeable misuse, evaluate risks identified through post-market monitoring data, and adopt appropriate risk-management measures. Crucially, residual risks must be communicated to the deployer, and the system must be tested under realistic conditions. The NIST AI Risk Management Framework, with its GOVERN/MAP/MEASURE/MANAGE functions, is widely regarded as the state of the art for structuring this work.

What you must do

  • Establish a written risk management plan covering the full lifecycle.
  • Identify foreseeable misuse scenarios, not just intended use.
  • Test under realistic conditions before placing the system on the market AND continuously thereafter.
  • Communicate residual risks clearly in the instructions for use.
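The obligations above imply keeping a living risk register: each entry records the risk's source (intended use vs. foreseeable misuse), its assessed level, the mitigations adopted, and the residual risk that must be disclosed to the deployer. A minimal sketch of such a register in Python is shown below; the class names, fields, and scoring scheme are illustrative assumptions, not anything prescribed by Article 9.

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    """One entry in a hypothetical Article 9 risk register (illustrative only)."""
    description: str
    source: str                        # "intended use" or "foreseeable misuse"
    severity: int                      # assumed scale: 1 (negligible) .. 5 (critical)
    likelihood: int                    # assumed scale: 1 (rare) .. 5 (frequent)
    mitigations: list[str] = field(default_factory=list)
    residual_level: int = 0            # risk remaining after mitigations

    def score(self) -> int:
        # Simple severity x likelihood matrix, a common (but not mandated) scheme.
        return self.severity * self.likelihood

@dataclass
class RiskRegister:
    """Lifecycle-wide register; re-evaluated as post-market data arrives."""
    risks: list[Risk] = field(default_factory=list)

    def add(self, risk: Risk) -> None:
        self.risks.append(risk)

    def residual_risks_for_instructions(self) -> list[Risk]:
        # Residual risks that would need to be communicated to the deployer.
        return [r for r in self.risks if r.residual_level > 0]

reg = RiskRegister()
reg.add(Risk("Model misclassifies rare edge cases", "intended use",
             severity=4, likelihood=3,
             mitigations=["augment training data", "human review step"],
             residual_level=2))
reg.add(Risk("Operator over-relies on automated output", "foreseeable misuse",
             severity=3, likelihood=4,
             mitigations=["prominent UI warning"], residual_level=0))
```

Note that the register deliberately tracks misuse scenarios alongside intended use, mirroring the checklist above: both categories must be identified and re-evaluated throughout the lifecycle, not only before market placement.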
Official text

The authoritative text of Article 9 is published by the Publications Office of the European Union on EUR-Lex. We link directly to it rather than mirror it, so you always read the current consolidated version straight from the source.

Read Article 9 on EUR-Lex

Source: Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence (Artificial Intelligence Act). The only authentic version is the one published in the Official Journal of the European Union.

Tools that help you comply with this article