
LLM API Provider (GPAI)

How NeuralAPI SA, a Swiss company providing a large language model API to EU developers, navigated general-purpose AI (GPAI) obligations under Art. 53 — covering technical documentation, copyright compliance, downstream provider information, and AI Office registration.

Company

NeuralAPI SA

Jurisdiction

Switzerland (Geneva)

Employees

85

GPAI classification

Standard GPAI (Art. 53)

Company Profile

About NeuralAPI SA

NeuralAPI SA trains and operates a proprietary large language model, exposed to the market as a REST API. The API is used by 2,000+ EU-based developers and businesses to build applications ranging from document summarisation and code assistance to customer service chatbots and internal knowledge management tools. NeuralAPI has no consumer-facing product of its own — its customers are developers and businesses (downstream providers and deployers) who integrate the API into their own products. Although NeuralAPI is incorporated in Switzerland, it places its model on the EU market and is therefore subject to the EU AI Act's GPAI provisions.

Business model

  • API product — pay-per-token pricing with monthly subscription tiers
  • 2,000+ active EU developer and business customers
  • Revenue: API usage fees + enterprise contracts
  • Markets: DE, FR, NL, SE, PL, IT, ES, and 14 other EU member states

Model profile

  • Transformer-based LLM, 34B parameters
  • Training compute: 2×10²⁴ FLOPs
  • Training data: multilingual web corpus, books, code (2020–2024)
  • Context window: 128K tokens; supports text generation, classification, embeddings

EU AI Act — GPAI Classification

Standard GPAI — Not Systemic Risk

Threshold analysis

The EU AI Act designates GPAI models with systemic risk as those trained using total compute exceeding 10²⁵ FLOPs (Art. 51(1)(a)) or those otherwise determined to have high-impact capabilities by the AI Office (Art. 51(1)(b)). NeuralAPI's model was trained using approximately 2×10²⁴ FLOPs — roughly one-fifth of the systemic risk threshold. NeuralAPI therefore falls under the standard GPAI provider obligations in Art. 53, not the additional systemic risk obligations in Art. 55. The compliance deadline for all GPAI providers was 2 August 2025.

NeuralAPI — standard GPAI (Art. 53)

  • Training compute: 2×10²⁴ FLOPs
  • Obligations: Art. 53(1)(a–d)
  • Not applicable: adversarial testing, incident reporting, AI Office model evaluation
  • Compliance deadline: 2 August 2025

Systemic risk threshold (not applicable here)

  • Threshold: >10²⁵ FLOPs (Art. 51(1)(a))
  • Additional obligations: Art. 55
  • Includes: adversarial testing, incident reporting, cybersecurity
  • NeuralAPI's training compute is roughly one-fifth of this threshold

Art. 53 Obligations

Compliance Checklist — NeuralAPI's Status

Art. 53(1)(a) — technical documentation
Maintain technical documentation including model architecture, training methodology, evaluation results, and capabilities

67-page technical documentation package produced covering: transformer architecture specification, pre-training and fine-tuning methodology, training data composition and sources, evaluation results across 12 benchmark tasks, known limitations and refusal behaviour, and intended use cases. Held internally; available to AI Office on request.

Art. 53(1)(b) — downstream provider information
Provide downstream providers with information and documentation enabling compliance with their own obligations

Comprehensive model card published in API documentation covering capabilities, limitations, and appropriate use cases. API documentation updated with capability assessments for common downstream application types. Developer agreement updated to clarify provider/deployer compliance responsibilities. Enterprise compliance pack created for customers building Annex III high-risk systems.

Art. 53(1)(c) — copyright compliance policy
Implement a policy to comply with EU copyright law, including respect for text and data mining (TDM) opt-outs

Copyright compliance policy published. TDM opt-out detection process implemented: web crawl pipeline now checks and honours robots.txt TDM directives and dedicated TDM reservation notices (per DSM Directive Art. 4(3)). Licensing review of all third-party datasets completed; 3 datasets removed due to unclear licensing terms.
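One machine-readable channel for the TDM reservations mentioned above is robots.txt. A minimal crawl-time check using Python's standard urllib.robotparser (the user-agent string and the sample robots.txt are hypothetical; a production pipeline would also consult other reservation mechanisms, such as the W3C TDM Reservation Protocol):

```python
from urllib import robotparser

# Hypothetical crawler identity; the source does not name NeuralAPI's bot.
USER_AGENT = "ExampleCrawler"

def may_mine(robots_txt: str, url: str, user_agent: str = USER_AGENT) -> bool:
    """Honour a robots.txt reservation before fetching a page for training.

    robots.txt is only one opt-out channel: DSM Directive Art. 4(3) accepts
    any machine-readable reservation, so a real pipeline needs further checks.
    """
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(user_agent, url)

# Hypothetical site policy reserving part of the site from crawling.
robots = """\
User-agent: *
Disallow: /private/
"""
```

Calling `may_mine(robots, "https://example.org/private/page")` returns False, so the page is skipped; paths outside the reserved prefix return True and may be fetched.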

Art. 53(1)(d) — training data summary
Publish a sufficiently detailed summary of the training data used

Public training data summary published at neuralapi.com/model/transparency. Covers: data categories (web text, books, code, multilingual data), language distribution, approximate token counts per category, data collection period (2020–2024), and filtering and quality criteria applied. Level of detail calibrated against AI Office draft guidance; specific source URLs withheld on competitive and legal grounds.

Art. 56 — AI Office registration — in progress
Register GPAI model with the EU AI Office

AI Office registration portal opened in phases from mid-2025. NeuralAPI submitted initial registration information on 1 August 2025 — one day before the mandatory deadline. Awaiting confirmation of registration number. Full technical documentation package prepared for submission on request.

Art. 54 — EU authorised representative
Non-EU providers must appoint an EU-established authorised representative

Written mandate executed with a Brussels-based regulatory consultancy appointed as EU authorised representative in April 2025. Representative contact details published on company website, in API documentation, and in technical documentation.

Code of Practice — voluntary participation
Participation in the GPAI Code of Practice process overseen by the AI Office

NeuralAPI joined as an early adopter signatory to the GPAI Code of Practice in January 2025, well before the August 2025 compliance deadline. Participation covers transparency, copyright, and safety commitments. Early adoption provided access to draft AI Office guidance and created a rebuttable presumption of Art. 53 compliance.


Internal Organisation

How NeuralAPI Structured Its Compliance Team

As an 85-person company, NeuralAPI could not justify a dedicated legal department. The GPAI compliance programme was led by a cross-functional working group of four people: the VP of Engineering (technical documentation and architecture), the Head of Data (training data summary and copyright policy), the General Counsel (legal analysis, Code of Practice, AI Office registration), and the Head of Product (developer documentation and model card). The group met fortnightly from January 2025 and was supported by two external advisers: a Geneva-based technology law firm with EU AI Act specialisation, and an independent AI safety consultant for benchmark evaluation design.

Internal team

  • VP Engineering — technical documentation lead
  • Head of Data — training data summary and copyright
  • General Counsel — legal analysis and AI Office liaison
  • Head of Product — model card and developer docs

External support

  • Technology law firm — EU AI Act and copyright legal advice
  • AI safety consultant — benchmark evaluation design
  • Estimated external legal spend: ~€35,000 over 8 months
  • Internal time: ~0.5 FTE equivalent across working group

Key Challenges

What Made Compliance Difficult

1. What counts as “sufficient detail” in technical documentation

Art. 53(1)(a) requires technical documentation, but the Act does not specify a precise format or minimum content standard for standard GPAI providers (unlike the detailed Annex IV template applicable to high-risk systems). NeuralAPI's legal team reviewed the AI Office's draft guidance and the GPAI Code of Practice documentation templates to calibrate the expected level of detail. The key tension was between regulatory sufficiency and competitive sensitivity: disclosing too much about training data composition or model architecture could disadvantage NeuralAPI commercially. The final 67-page technical file was designed to fully satisfy the regulatory floor while limiting disclosure of genuinely proprietary methodology.

2. Retroactive copyright compliance for training data

The EU's DSM Directive (Art. 4) allows text and data mining for commercial purposes unless rightholders have expressly reserved this right. Implementing a robust TDM opt-out detection process retroactively — after the model had already been trained — raised difficult questions. NeuralAPI could not un-train on data that had been ingested, but it could: (1) document its good-faith crawl practices at the time of training, (2) publish a clear policy for future training runs, and (3) remove datasets from the documented training corpus where licensing was unclear. Three third-party datasets were removed from the training data summary after the licensing review found terms inconsistent with commercial TDM use.

3. Downstream provider information — defining the compliance split

Art. 53(1)(b) requires GPAI providers to give downstream providers the information they need to meet their own EU AI Act obligations. For NeuralAPI, this meant understanding what its 2,000+ API customers were building — and what compliance information they might need. NeuralAPI could not perform due diligence on every downstream use case. Its solution was a layered information approach: a public model card covering general capabilities and limitations; a developer documentation section on common application types and their compliance implications; and a dedicated enterprise compliance pack for customers building systems that might themselves be high-risk under Annex III.

4. Swiss incorporation — confirming the EU AI Act applies

NeuralAPI is incorporated in Switzerland, which is outside the EU. However, the EU AI Act applies to providers who place AI systems or GPAI models on the EU market or put them into service in the EU — regardless of where the provider is established (Art. 2(1)(a)). With 2,000+ EU developer customers actively using the API to build EU-market products, there was no credible argument that NeuralAPI was outside the Act's scope. NeuralAPI appointed an EU AI Act authorised representative based in Brussels, as required by Art. 54 for non-EU providers.

Compliance Timeline

NeuralAPI's Path to the August 2025 Deadline

Oct 2024

Initial scoping and GPAI classification analysis

External legal counsel engaged. Confirmed GPAI classification and standard (non-systemic) tier based on 2×10²⁴ FLOPs training compute. Compliance working group formed.

Nov 2024

Training data audit and copyright review

Head of Data led a full audit of training data sources. 3 datasets removed due to unclear TDM licensing. TDM opt-out detection process designed for future training runs.

Dec 2024

Technical documentation drafting begins (Art. 53(1)(a))

VP Engineering led documentation of model architecture, training methodology, and evaluation framework. AI safety consultant engaged to design benchmark evaluation suite across 12 tasks.

Jan 2025

Code of Practice — early adopter sign-up

NeuralAPI joined the GPAI Code of Practice as an early adopter, before the August 2025 deadline. Participation covers transparency, copyright, and safety commitments.

Feb 2025

Copyright compliance policy published (Art. 53(1)(c))

TDM opt-out detection integrated into the crawl pipeline. Copyright compliance policy published on company website. Licensing review of all third-party datasets completed.

Mar 2025

Benchmark evaluations completed

12-task benchmark evaluation suite run across reasoning, coding, multilingual performance, instruction following, and safety (refusal behaviour). Results documented in technical file.

Apr 2025

EU authorised representative appointed (Art. 54)

Written mandate executed with Brussels-based regulatory consultancy. Representative contact details published on website and in API documentation.

May 2025

Training data summary published (Art. 53(1)(d))

Public training data summary published at neuralapi.com/model/transparency. Covers data categories, language distribution, approximate token counts, and filtering criteria.

Jun 2025

Developer documentation and model card updated (Art. 53(1)(b))

Model card published in full. API documentation updated with capability assessments. Enterprise compliance pack created for downstream high-risk AI builders.

Jul 2025

Technical documentation finalised and internally approved

67-page technical documentation package signed off by VP Engineering and General Counsel. Stored securely; available to AI Office on request.

Aug 2025

AI Office registration submitted — deadline met

Registration information submitted to AI Office portal on 1 August 2025, one day before the mandatory 2 August deadline. Awaiting registration number confirmation.

Lessons Learned

What NeuralAPI Would Do Differently

Implement TDM opt-out detection before training

Retrofitting copyright compliance after training has already occurred is deeply uncomfortable. You cannot un-train a model. Any company planning to train on web data should implement TDM opt-out detection in the crawl pipeline before the first byte of training data is collected — not as a post-hoc audit.

The model card is a living document

NeuralAPI initially treated its model card as a static document; that proved a mistake, because every model update, fine-tune, or capability change requires a model card revision. NeuralAPI now treats model card updates as a mandatory step in the model release process — the same way software teams treat release notes.
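That release-gate step can be enforced mechanically. A minimal sketch, assuming a hypothetical model card schema in which the card records the version of the model it documents:

```python
def check_model_card(release_version: str, model_card: dict) -> None:
    """Fail a release when the model card was not revised for this version.

    Hypothetical schema: the card carries the version of the model it
    documents under the key "model_version".
    """
    documented = model_card.get("model_version")
    if documented != release_version:
        raise RuntimeError(
            f"Model card documents {documented!r}, but the release is "
            f"{release_version!r}: update the card before shipping."
        )

# Passes silently when the card matches the release being shipped.
card = {"model_version": "v2.3", "context_window": "128K tokens"}
check_model_card("v2.3", card)
```

Wired into CI, a check like this turns "the model card is a living document" from a policy statement into a hard gate: a fine-tune shipped as v2.4 cannot leave the pipeline until someone revises the card.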

Join the Code of Practice early

Early adopter status gave NeuralAPI access to draft guidance and peer discussions with other GPAI providers before the AI Office's formal positions solidified. This significantly reduced uncertainty about what “sufficient detail” meant for technical documentation. If you provide a GPAI model, joining the Code of Practice early is worthwhile even if participation is voluntary.

Appoint an EU representative early

As a Swiss company, NeuralAPI needed an EU authorised representative under Art. 54. Finding a suitable representative — one who understood AI and could genuinely engage with the AI Office — took longer than expected. Non-EU providers should identify and appoint their representative at the start of the compliance programme, not at the end.

Key Articles

Primary Legal References

Art. 3(63) — Definition of general-purpose AI model — broad capability to perform a wide range of distinct tasks
Art. 51(1)(a) — Systemic risk threshold — training compute exceeding 10²⁵ FLOPs triggers Art. 55 additional obligations
Art. 53(1)(a) — Technical documentation obligation — architecture, training methodology, capabilities, and evaluations
Art. 53(1)(b) — Downstream provider information — model card, API documentation, capability assessments
Art. 53(1)(c) — Copyright compliance policy — TDM opt-out detection and licensing review
Art. 53(1)(d) — Training data summary — publicly available summary of training data categories and composition
Art. 54 — Authorised representative — required for non-EU providers placing GPAI models on the EU market
Art. 56 — AI Office registration — GPAI providers must register their models with the EU AI Office
DSM Directive Art. 4 — Text and data mining — commercial TDM permitted unless rightholder has expressly opted out

Read the GPAI guide

Understand the full Art. 53 and Art. 55 obligation sets for standard and systemic-risk GPAI model providers.

GPAI Guide →

Look up GPAI definitions

See how the EU AI Act defines “GPAI model”, “systemic risk”, “provider”, and related terms.

Definitions →