LLM API Provider (GPAI)
How NeuralAPI SA, a Swiss company providing a large language model API to EU developers, navigated general-purpose AI (GPAI) obligations under Art. 53 — covering technical documentation, copyright compliance, downstream provider information, and AI Office registration.
Company
NeuralAPI SA
Jurisdiction
Switzerland (Geneva)
Employees
85
GPAI classification
Standard GPAI (Art. 53)
Company Profile
About NeuralAPI SA
NeuralAPI SA trains and operates a proprietary large language model, exposed to the market as a REST API. The API is used by 2,000+ EU-based developers and businesses to build applications ranging from document summarisation and code assistance to customer service chatbots and internal knowledge management tools. NeuralAPI has no consumer-facing product of its own — its customers are developers and businesses (downstream providers and deployers) who integrate the API into their own products. Although NeuralAPI is incorporated in Switzerland, it places its model on the EU market and is therefore subject to the EU AI Act's GPAI provisions.
Business model
- API product — pay-per-token pricing with monthly subscription tiers
- 2,000+ active EU developer and business customers
- Revenue: API usage fees + enterprise contracts
- Markets: DE, FR, NL, SE, PL, IT, ES, and 14 other EU member states
Model profile
- Transformer-based LLM, 34B parameters
- Training compute: 2×10²⁴ FLOPs
- Training data: multilingual web corpus, books, code (2020–2024)
- Context window: 128K tokens; supports text generation, classification, embeddings
EU AI Act — GPAI Classification
Standard GPAI — Not Systemic Risk
Threshold analysis
The EU AI Act classifies a GPAI model as presenting systemic risk where it has high-impact capabilities (Art. 51(1)(a)) or the Commission so decides (Art. 51(1)(b)); a model is presumed to have high-impact capabilities when its cumulative training compute exceeds 10²⁵ FLOPs (Art. 51(2)). NeuralAPI's model was trained using approximately 2×10²⁴ FLOPs — roughly one-fifth of the presumption threshold. NeuralAPI therefore falls under the standard GPAI provider obligations in Art. 53, not the additional systemic-risk obligations in Art. 55. The compliance deadline for all GPAI providers was 2 August 2025.
NeuralAPI — standard GPAI (Art. 53)
- Training compute: 2×10²⁴ FLOPs
- Obligations: Art. 53(1)(a–d)
- Not applicable: adversarial testing, incident reporting, AI Office model evaluation
- Compliance deadline: 2 August 2025
Systemic risk threshold (not applicable here)
- Threshold: >10²⁵ FLOPs (Art. 51(2) presumption)
- Additional obligations: Art. 55
- Includes: adversarial testing, incident reporting, cybersecurity
- NeuralAPI's training compute is roughly one-fifth of this threshold
Art. 53 Obligations
Compliance Checklist — NeuralAPI's Status
67-page technical documentation package produced covering: transformer architecture specification, pre-training and fine-tuning methodology, training data composition and sources, evaluation results across 12 benchmark tasks, known limitations and refusal behaviour, and intended use cases. Held internally; available to AI Office on request.
Comprehensive model card published in API documentation covering capabilities, limitations, and appropriate use cases. API documentation updated with capability assessments for common downstream application types. Developer agreement updated to clarify provider/deployer compliance responsibilities. Enterprise compliance pack created for customers building Annex III high-risk systems.
Copyright compliance policy published. TDM opt-out detection process implemented: web crawl pipeline now checks and honours robots.txt TDM directives and dedicated TDM reservation notices (per DSM Directive Art. 4(3)). Licensing review of all third-party datasets completed; 3 datasets removed due to unclear licensing terms.
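A minimal sketch of what an opt-out check in a crawl pipeline might look like. The `tdm-reservation` HTTP header comes from the W3C TDM Reservation Protocol draft; the function name, default user agent, and overall structure are illustrative assumptions, not NeuralAPI's actual implementation.

```python
from urllib import robotparser


def may_mine(url: str, robots_txt: str, response_headers: dict,
             user_agent: str = "ExampleCrawler") -> bool:
    """Decide whether a fetched page may enter a training corpus.

    Illustrative sketch only: checks a TDM reservation header (per the
    W3C TDM Reservation Protocol draft) and the site's robots.txt rules.
    """
    # 1. A 'tdm-reservation: 1' response header signals that the
    #    rightsholder reserves text-and-data-mining rights (DSM Art. 4(3)).
    if response_headers.get("tdm-reservation", "").strip() == "1":
        return False
    # 2. Honour robots.txt Disallow rules for this crawler's user agent.
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)
```

A real pipeline would also need to handle the protocol's `/.well-known/tdmrep.json` file, per-path reservations, and dedicated opt-out mechanisms that individual sites publish; the point of running the check at crawl time is that, as the case study notes, it cannot be applied retroactively once data has been trained on.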
Public training data summary published at neuralapi.com/model/transparency. Covers: data categories (web text, books, code, multilingual data), language distribution, approximate token counts per category, data collection period (2020–2024), and filtering and quality criteria applied. Level of detail calibrated against AI Office draft guidance; specific source URLs withheld on competitive and legal grounds.
AI Office registration portal opened in phases from mid-2025. NeuralAPI submitted initial registration information on 1 August 2025 — one day before the mandatory deadline. Awaiting confirmation of registration number. Full technical documentation package prepared for submission on request.
Written mandate executed with a Brussels-based regulatory consultancy appointed as EU authorised representative in April 2025. Representative contact details published on company website, in API documentation, and in technical documentation.
NeuralAPI joined as an early adopter signatory to the GPAI Code of Practice in March 2025, before the August 2025 compliance deadline. Participation covers transparency, copyright, and safety commitments. Early adoption provided access to draft AI Office guidance and created a rebuttable presumption of Art. 53 compliance.
Internal Organisation
How NeuralAPI Structured Its Compliance Team
As an 85-person company, NeuralAPI could not justify a dedicated legal department. The GPAI compliance programme was led by a cross-functional working group of four people: the VP of Engineering (technical documentation and architecture), the Head of Data (training data summary and copyright policy), the General Counsel (legal analysis, Code of Practice, AI Office registration), and the Head of Product (developer documentation and model card). The group met fortnightly from January 2025 and was supported by two external advisers: a Geneva-based technology law firm with EU AI Act specialisation, and an independent AI safety consultant for benchmark evaluation design.
Internal team
- VP Engineering — technical documentation lead
- Head of Data — training data summary and copyright
- General Counsel — legal analysis and AI Office liaison
- Head of Product — model card and developer docs
External support
- Technology law firm — EU AI Act and copyright legal advice
- AI safety consultant — benchmark evaluation design
- Estimated external legal spend: ~€35,000 over 8 months
- Internal time: ~0.5 FTE equivalent across working group
Key Challenges
What Made Compliance Difficult
1. What counts as “sufficient detail” in technical documentation
Art. 53(1)(a) requires technical documentation, but the Act does not specify a precise format or minimum content standard for standard GPAI providers (unlike the detailed Annex IV template applicable to high-risk systems). NeuralAPI's legal team reviewed the AI Office's draft guidance and the GPAI Code of Practice documentation templates to calibrate the expected level of detail. The key tension was between regulatory sufficiency and competitive sensitivity: disclosing too much about training data composition or model architecture could disadvantage NeuralAPI commercially. The final 67-page technical file was designed to fully satisfy the regulatory floor while limiting disclosure of genuinely proprietary methodology.
2. Retroactive copyright compliance for training data
The EU's DSM Directive (Art. 4) allows text and data mining for commercial purposes unless rightholders have expressly reserved this right. Implementing a robust TDM opt-out detection process retroactively — after the model had already been trained — raised difficult questions. NeuralAPI could not un-train on data that had been ingested, but it could: (1) document its good-faith crawl practices at the time of training, (2) publish a clear policy for future training runs, and (3) remove datasets from the documented training corpus where licensing was unclear. Three third-party datasets were removed from the training data summary after the licensing review found terms inconsistent with commercial TDM use.
3. Downstream provider information — defining the compliance split
Art. 53(1)(b) requires GPAI providers to give downstream providers the information they need to meet their own EU AI Act obligations. For NeuralAPI, this meant understanding what its 2,000+ API customers were building — and what compliance information they might need. NeuralAPI could not perform due diligence on every downstream use case. Its solution was a layered information approach: a public model card covering general capabilities and limitations; a developer documentation section on common application types and their compliance implications; and a dedicated enterprise compliance pack for customers building systems that might themselves be high-risk under Annex III.
4. Swiss incorporation — confirming the EU AI Act applies
NeuralAPI is incorporated in Switzerland, which is outside the EU. However, the EU AI Act applies to providers that place AI systems or GPAI models on the EU market or put them into service in the EU, irrespective of whether they are established in the Union or in a third country (Art. 2(1)(a)). With 2,000+ EU developer customers actively using the API to build EU-market products, there was no credible argument that NeuralAPI was outside the Act's scope. NeuralAPI appointed an EU authorised representative based in Brussels, as required by Art. 54 for non-EU providers.
Compliance Timeline
NeuralAPI's Path to the August 2025 Deadline
Initial scoping and GPAI classification analysis
External legal counsel engaged. Confirmed GPAI classification and standard (non-systemic) tier based on 2×10²⁴ FLOPs training compute. Compliance working group formed.
Training data audit and copyright review
Head of Data led a full audit of training data sources. 3 datasets removed due to unclear TDM licensing. TDM opt-out detection process designed for future training runs.
Technical documentation drafting begins (Art. 53(1)(a))
VP Engineering led documentation of model architecture, training methodology, and evaluation framework. AI safety consultant engaged to design benchmark evaluation suite across 12 tasks.
Code of Practice — early adopter sign-up
NeuralAPI joined the GPAI Code of Practice as an early adopter, before the August 2025 deadline. Participation covers transparency, copyright, and safety commitments.
Copyright compliance policy published (Art. 53(1)(c))
TDM opt-out detection integrated into the crawl pipeline. Copyright compliance policy published on company website. Licensing review of all third-party datasets completed.
Benchmark evaluations completed
12-task benchmark evaluation suite run across reasoning, coding, multilingual performance, instruction following, and safety (refusal behaviour). Results documented in technical file.
EU authorised representative appointed (Art. 54)
Written mandate executed with Brussels-based regulatory consultancy. Representative contact details published on website and in API documentation.
Training data summary published (Art. 53(1)(d))
Public training data summary published at neuralapi.com/model/transparency. Covers data categories, language distribution, approximate token counts, and filtering criteria.
Developer documentation and model card updated (Art. 53(1)(b))
Model card published in full. API documentation updated with capability assessments. Enterprise compliance pack created for downstream high-risk AI builders.
Technical documentation finalised and internally approved
67-page technical documentation package signed off by VP Engineering and General Counsel. Stored securely; available to AI Office on request.
AI Office registration submitted — deadline met
Registration information submitted to AI Office portal on 1 August 2025, one day before the mandatory 2 August deadline. Awaiting registration number confirmation.
Lessons Learned
What NeuralAPI Would Do Differently
Implement TDM opt-out detection before training
Retrofitting copyright compliance after training has already occurred is deeply uncomfortable. You cannot un-train a model. Any company planning to train on web data should implement TDM opt-out detection in the crawl pipeline before the first byte of training data is collected — not as a post-hoc audit.
The model card is a living document
NeuralAPI's initial model card was treated as a static document. Every model update, fine-tune, or capability change requires a model card revision. NeuralAPI now treats model card updates as a mandatory step in the model release process — the same way software teams treat release notes.
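One way to make that mandatory is a release-gate check in CI that refuses to ship a model whose card has not been revised. The sketch below is hypothetical: it assumes the model card carries a `model_version` field in simple key-value front matter, and none of the names come from NeuralAPI.

```python
def model_card_matches(card_text: str, release_version: str) -> bool:
    """Return True if the model card declares the version being released.

    Hypothetical release-gate check: assumes the card's front matter
    contains a line like 'model_version: 2.3.0'.
    """
    for line in card_text.splitlines():
        if line.strip().startswith("model_version:"):
            declared = line.split(":", 1)[1].strip()
            return declared == release_version
    return False  # no version field at all: fail the gate
```

Wired into the release pipeline, a stale or missing card blocks the release in the same way a failing test would, which is what turns "living document" from a policy statement into an enforced step.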
Join the Code of Practice early
Early adopter status gave NeuralAPI access to draft guidance and peer discussions with other GPAI providers before the AI Office's formal positions solidified. This significantly reduced uncertainty about what “sufficient detail” meant for technical documentation. If you provide a GPAI model, joining the Code of Practice early is worthwhile even if participation is voluntary.
Appoint an EU representative early
As a Swiss company, NeuralAPI needed an EU authorised representative under Art. 54. Finding a suitable representative — one who understood AI and could genuinely engage with the AI Office — took longer than expected. Non-EU providers should identify and appoint their representative at the start of the compliance programme, not at the end.
Key Articles
Primary Legal References
Read the GPAI guide
Understand the full Art. 53 and Art. 55 obligation sets for standard and systemic-risk GPAI model providers.
GPAI Guide →
Look up GPAI definitions
See how the EU AI Act defines “GPAI model”, “systemic risk”, “provider”, and related terms.
Definitions →