Shaping Europe’s digital future

General-purpose AI obligations under the AI Act

General-purpose AI models (GPAI) are the foundation for many different AI systems. To ensure safe and transparent AI, the AI Act sets obligations for all providers of GPAI models, and additional obligations for providers of the most advanced and impactful models, i.e. GPAI models with systemic risk. These obligations apply from 2 August 2025.


Obligations for all providers of GPAI models

  • Draw up technical documentation
  • Implement a copyright policy
  • Publish a summary of the model's training content

Obligations for providers of GPAI models with systemic risk

  • Notify the Commission
  • Risk assessment and mitigation
  • Incident reporting
  • Cybersecurity protections

The Commission has published two documents to help providers of GPAI models fulfil their obligations under the AI Act:


Guidelines on the scope of the obligations for providers of GPAI models


Template for the public summary of training content for GPAI models


The Commission and the Member States also confirmed that the GPAI Code of Practice, developed by independent experts, is an adequate voluntary tool for providers of GPAI models to demonstrate compliance with their obligations under the AI Act.


The Commission’s GPAI guidelines clarify who must comply with the AI Act obligations, while the voluntary Code of Practice and the mandatory template for the public summary of training content clarify how to comply.


 

Documents relevant to all providers of GPAI models*

  • Guidelines on the scope of the obligations for providers of GPAI models
  • Template for the public summary of training content for GPAI models
  • Transparency and Copyright chapters of the GPAI Code of Practice

*Models are considered GPAI if they are trained with more than 10^23 FLOP and can generate language, as set out in the Guidelines.


 

Documents relevant only to providers of GPAI models with systemic risk**

 


 

  • Safety and Security chapter of the GPAI Code of Practice

 


**GPAI models are presumed to pose systemic risk if they are trained with more than 10^25 FLOP, as set out in the AI Act. This threshold is currently under review.
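Taken together, the two footnotes describe simple compute-based presumptions: above 10^23 FLOP (with language-generation capability) a model is presumed GPAI, and above 10^25 FLOP it is presumed to pose systemic risk. A minimal sketch of that classification logic, with illustrative function and variable names that are not taken from the AI Act itself (and remembering that the 10^25 threshold is under review):

```python
# Compute-based presumptions described on this page; names are illustrative.
GPAI_THRESHOLD_FLOP = 1e23           # presumed GPAI if above and can generate language
SYSTEMIC_RISK_THRESHOLD_FLOP = 1e25  # presumed systemic risk if above (under review)

def classify(training_flop: float, generates_language: bool) -> str:
    """Classify a model under the AI Act's compute presumptions (sketch)."""
    if training_flop > SYSTEMIC_RISK_THRESHOLD_FLOP:
        return "GPAI model with systemic risk"
    if training_flop > GPAI_THRESHOLD_FLOP and generates_language:
        return "GPAI model"
    return "not presumed GPAI"

# Example: a model trained with 3e25 FLOP
print(classify(3e25, generates_language=True))  # GPAI model with systemic risk
```

This is only a sketch of the numeric presumptions; in practice, the Guidelines set out further criteria and exceptions beyond raw training compute.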


Resources and contact


Providers who need to notify the Commission of a GPAI model with systemic risk should contact EU-AIOFFICEGPAI-SR-PROVIDERS@ec.europa.eu. The AI Office has practical guidance available.
