EU AI Act for SMEs & Startups
A practical, simplified compliance path for businesses with fewer than 250 employees. The EU AI Act includes explicit proportionality provisions for SMEs — you are not expected to comply in the same way as a large enterprise. This guide explains what those provisions mean in practice and gives you a 10-step path to compliance.
- **SME threshold:** < 250 employees
- **Proportionality:** Art. 9(7) + Art. 16(h)
- **Key recital:** Recital 21
- **High-risk deadline:** August 2026
What the Act says about SMEs
**Art. 9(7).** Risk management systems must be “proportionate to the size of the provider and the nature and scale of the risks.” SMEs are explicitly named.

**Art. 16(h).** SME providers “may provide elements of the technical documentation in a simplified manner.” The reduced documentation burden is legally mandated.

**Recital 21.** The regulation expressly states that “the financial and administrative burden” on SMEs and startups should be “kept to a minimum.”
10-Step Simplified Compliance Path
Determine if the Act applies at all
**Scope.** The EU AI Act applies if you place an AI system on the EU market, put it into service in the EU, or the system's output is used in the EU — regardless of where you are based. If you only develop AI for internal use, for export outside the EU, or for purely personal non-professional use, you may fall outside scope entirely. Check Art. 2 carefully.
Check if your system is prohibited
**Prohibited.** Eight categories of AI practice are banned outright under Art. 5 from February 2, 2025. These include social scoring by public authorities, real-time remote biometric identification in public spaces (with narrow exceptions), systems that exploit vulnerabilities of persons, and subliminal manipulation techniques. If your system does any of these, you must cease development or deployment immediately.
Determine your risk level
**Risk classification.** Most SME AI systems will be minimal-risk (chatbots, recommendation engines, spam filters) with no mandatory obligations. High-risk status arises in two ways: through the use cases listed in Annex III (HR and recruitment tools, credit scoring, CV-sorting, educational assessment, biometric categorisation) or because the AI is a safety component of a product already covered by EU harmonisation legislation, such as medical device software (Art. 6(1)). If you supply a general-purpose AI model (GPAI), separate rules apply from August 2025.
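The classification step can be sketched as a simple lookup. This is purely illustrative — the category keywords and tier names below are assumptions for the sketch, not an official taxonomy; always verify against Art. 5 and Annex III themselves:

```python
# Illustrative risk-tier lookup (assumed keywords, not an official classification).
PROHIBITED = {"social_scoring", "realtime_remote_biometric_id", "subliminal_manipulation"}
HIGH_RISK = {"hr_screening", "credit_scoring", "cv_sorting",
             "educational_assessment", "biometric_categorisation"}

def risk_tier(use_case: str) -> str:
    """Return a rough tier for a use-case keyword; anything unlisted defaults to minimal-risk."""
    if use_case in PROHIBITED:
        return "prohibited"
    if use_case in HIGH_RISK:
        return "high-risk"
    return "minimal-risk"

print(risk_tier("credit_scoring"))  # high-risk
print(risk_tier("spam_filter"))     # minimal-risk
```

A real assessment cannot be reduced to keywords — borderline systems need a reading of the Annex III descriptions and the Art. 6 filter conditions — but the tiered structure itself is this simple.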
For high-risk: simplified documentation requirements
**High-risk.** Under Art. 9(7), your risk management system must be proportionate to the size of your organisation and the nature and scale of the risks. Art. 16(h) explicitly permits SMEs to use simplified technical documentation formats. You do not need the same volume of documentation as a large enterprise — but you must still have a risk management system, technical documentation, data governance practices, and logging.
Self-assessment (most SMEs won't need a notified body)
**Conformity.** The majority of high-risk AI systems in Annex III can be self-assessed under Art. 43(2) — you do not need to engage an expensive third-party notified body. Exceptions exist for remote biometric identification systems and a narrow set of Annex III systems where third-party assessment is mandatory. Prepare your technical documentation and EU declaration of conformity in-house.
Use free resources: EU AI Office sandbox and standardisation
**Free resources.** Art. 57 requires Member States to establish at least one regulatory sandbox, with priority access for SMEs and startups. These sandboxes let you develop, test, and validate your AI system in a controlled environment before market placement, with reduced regulatory risk. Sandbox access is free of charge for SMEs. The EU AI Office also publishes templates, guidance, and standardisation inputs.
CE marking if placing a product on the market
**CE marking.** If your high-risk AI system is a standalone product or embedded in a product (e.g. a medical device app, safety component), it must carry CE marking. The CE marking signals conformity with the EU AI Act and any other applicable regulations (MDR, Machinery Regulation, etc.). You affix CE marking after completing your conformity assessment and issuing an EU declaration of conformity.
Post-market monitoring (simplified for SMEs)
**Monitoring.** Art. 72 requires providers of high-risk AI systems to operate a post-market monitoring plan. For SMEs, this does not have to be a complex system — it means tracking performance, collecting user feedback, logging incidents, and reviewing the system against your initial risk assessment at reasonable intervals. If something goes wrong, serious incidents must be reported to national market surveillance authorities.
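The lightweight monitoring described above can start as nothing more than a structured incident log. A minimal sketch, with assumed field names (the Act prescribes no particular schema):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IncidentRecord:
    """Assumed minimal fields for an SME post-market monitoring log entry."""
    description: str
    serious: bool  # serious incidents trigger reporting to the surveillance authority
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

log: list[IncidentRecord] = []

def record_incident(description: str, serious: bool) -> IncidentRecord:
    # Append to the log; flag serious incidents for mandatory reporting.
    rec = IncidentRecord(description, serious)
    log.append(rec)
    if rec.serious:
        print("ACTION: report to national market surveillance authority")
    return rec

record_incident("Accuracy drift observed against baseline evaluation set", serious=False)
```

The point is auditability: a timestamped record that you noticed an issue, judged its seriousness, and acted — which is also what you review against your initial risk assessment at each interval.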
Deployer obligations if using third-party AI
**Deployer.** If you are deploying an AI system built by someone else (e.g. a SaaS HR tool, a credit scoring API), you are a 'deployer' under Art. 3(4). You still have obligations: conduct a fundamental rights impact assessment for certain high-risk deployments, ensure human oversight, report serious incidents, and ensure staff using the system have adequate AI literacy. Check your supplier contracts for provider obligations passed down to you.
Voluntary codes of conduct for minimal-risk systems
**Minimal-risk.** If your AI system is minimal-risk — the majority of consumer and B2B AI tools — no mandatory obligations apply. You may voluntarily adopt codes of conduct under Art. 95, which can improve trust with customers and enterprise buyers. The EU AI Office is developing voluntary commitments around transparency, bias testing, and environmental impact. Voluntary compliance is a competitive advantage.
Key SME Exemptions & Proportionality Provisions
| Provision | What it means for you |
|---|---|
| Proportionate risk management (Art. 9(7)) | The risk management system must be proportionate to the size of the provider and the nature and scale of the risks associated with the AI system. SMEs are not expected to match enterprise-scale risk management processes. |
| Simplified technical documentation (Art. 16(h)) | Providers that are SMEs may provide elements of technical documentation in a simplified manner. The Commission is required to establish a simplified technical documentation form targeted at the needs of SMEs. |
| Priority sandbox access (Art. 57(3)) | National regulatory sandboxes must be accessible to SMEs and startups as a priority. Where an SME or startup requests admission to a sandbox, competent authorities shall give it due consideration. |
| Recital 21 — proportionality principle | Recital 21 explicitly states that the obligations for SMEs and startups should be proportionate and that the financial and administrative burden imposed on them should be kept to a minimum. |
| No mandatory notified body for most systems | Self-assessment under Art. 43(2) is available for most Annex III high-risk systems, meaning SMEs can complete conformity without paying for external third-party assessment. |
Indicative Compliance Cost Estimates for SMEs
These are rough indicative ranges for a single AI system. Actual costs depend heavily on complexity, existing documentation, and whether in-house expertise is available. All costs exclude VAT.
| Activity | Cost estimate |
|---|---|
| Scope + risk classification review | €500–€2,000 |
| Technical documentation (simplified SME format) | €1,500–€5,000 |
| Risk management system setup | €1,000–€3,000 |
| Regulatory sandbox participation | Free |
| Conformity assessment (self-assessment) | €0–€1,000 |
| Third-party notified body (if required) | €10,000–€40,000 |
| Ongoing monitoring (per year) | €500–€2,000 |
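For orientation, the self-assessment route can be totalled from the table above (excluding the notified-body row, which only applies to the narrow set of systems requiring third-party assessment):

```python
# Indicative first-year cost ranges from the table above (EUR, excl. VAT),
# self-assessment route only — the notified-body row is deliberately excluded.
costs = {
    "scope_and_risk_classification": (500, 2_000),
    "technical_documentation":       (1_500, 5_000),
    "risk_management_setup":         (1_000, 3_000),
    "sandbox_participation":         (0, 0),
    "self_assessment":               (0, 1_000),
    "monitoring_year_one":           (500, 2_000),
}
low = sum(lo for lo, _ in costs.values())
high = sum(hi for _, hi in costs.values())
print(f"Indicative first-year total: EUR {low:,}-{high:,}")
# Indicative first-year total: EUR 3,500-13,000
```

So a typical self-assessed high-risk system lands in roughly the €3,500–€13,000 range for year one — an order of magnitude below the notified-body row alone.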
Free Resources from the EU AI Office
AIAuditRef Tools for SMEs
All AIAuditRef tools are free to use and are designed with SMEs in mind. Start with the Risk Classifier to determine your system's risk level, then use the Checklist Builder to generate a proportionate compliance checklist.
Start your SME compliance review
Use the Risk Classifier to find out in minutes which tier your AI system falls into.
Classify your AI system →