The Digital Services Act (DSA) details a range of actions to promote transparency and accountability of online services, without hindering innovation and competitiveness.
Transparency reports
Starting from 17 February 2024, and at least once a year, all providers of intermediary services must make reports on their content moderation publicly available. The reports must cover their content moderation practices, including the number of orders received from national judicial or administrative authorities, the measures their content moderation consists of, the number of pieces of content taken down, and the accuracy and error rate of their automated content moderation systems. In addition, hosting providers must report the number of notices received from users and trusted flaggers, while online platforms must also provide information on out-of-court dispute settlements and the number of suspensions imposed on users for misusing their services.
In view of the additional risks relating to their activities, size and impact, and of their additional obligations under the Regulation, further transparency requirements apply to very large online platforms (VLOPs) and very large online search engines (VLOSEs). In particular, VLOPs and VLOSEs must publish their transparency reports at least every six months. The reports must also include information on their content moderation teams, including their qualifications and linguistic expertise.
VLOPs and VLOSEs have to publish their first reports including the additional transparency requirements at the latest six months after their designation.
In November 2024, the Commission adopted an Implementing Regulation that standardises the format, content, and reporting periods for transparency reports, ensuring providers of intermediary services report in a harmonised way.
Until then, transparency reports were highly heterogeneous in format and in their content moderation categories, while the reporting periods of VLOPs and VLOSEs were not aligned, depending instead on their dates of designation.
With the Implementing Regulation, in force since 1 July 2025, and the introduction of machine-readable templates for companies to report, these reports are now standardised and comparable across reporting periods and platforms. The first harmonised reports were published in February 2026.
Please find an overview below; note that not all services keep their previous reports online.
The transparency reports of the VLOPs and VLOSEs are added to the following tables based on the timing of their designation. Use the slider at the bottom to see older reports.
*The webpage opens with the latest report, but previous reports are available at the bottom of the webpage.
** Clicking downloads a ZIP file.
***Reports for both Facebook and Instagram are downloadable under the heading "EU Digital Services Act: Transparency Reports for Very Large Online Platforms".
Warning: Clicking on the following links will redirect you to adult content websites. Viewer discretion is advised.
| Platform | Reports (newest to oldest) ||||
| --- | --- | --- | --- | --- |
| Pornhub | February 2026 | August 2025 | February 2025 | August 2024 |
| Stripchat | February 2026 | June 2025 | December 2024 | June 2024 |
| XVideos | February 2026 | August 2025 | February 2025 | June 2024 |
Warning: Clicking on the XNXX report will redirect you to an adult content website. Viewer discretion is advised.
| Platform | Reports (newest to oldest) ||||
| --- | --- | --- | --- | --- |
| Shein* | | | | |
| Temu | February 2026 | August 2025 | February 2025 | November 2024 |
| XNXX | February 2026 | August 2025 | January 2025 | |
*Reports from Shein are available under point "4. Transparency reporting obligations".
Access the regulation, the templates and instructions
Publication of monthly average user numbers
The DSA requires all providers of online platforms, except small and micro enterprises, to publish the number of monthly users of their services in the EU and to update it every six months. This obligation took effect on 17 February 2023. The data from this self-assessment was fundamental to designating the first Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) and will continue to be important in monitoring the landscape of intermediary service providers. The Commission may also use other information available to it when monitoring the number of monthly users of platform services in the EU.
DSA Transparency Database & Statements of reasons
The DSA requires all hosting services to inform users whenever they remove or otherwise restrict access to their content, and to explain each moderation decision to the affected user in a “statement of reasons”. Statements of reasons must contain clear and specific information, spelling out the reason(s) for the decision and the legal provision or terms-of-service clause on which the removal or restriction was based. Before submitting their statements of reasons to the DSA Transparency Database, online platforms need to ensure that they do not contain personal data.
On 25 September 2023, the Commission launched the DSA Transparency Database, which collects and makes publicly available statements of reasons almost in real time to enable scrutiny over the content moderation decisions of providers of online platforms. Everyone can access the DSA Transparency Database website and search for, read, and download the statements of reasons published there.
From 1 July 2025, the content moderation categories and keywords for submitting statements of reasons to the DSA Transparency Database were updated to be aligned with the data categories in the transparency reports, thus making the two transparency data sources comparable.
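Because the statements of reasons can be downloaded in bulk, they lend themselves to offline analysis. The following minimal sketch shows how a researcher might tally moderation decisions by legal ground and measure the share flagged by automated means; the field names used here are simplified assumptions for illustration, not the database's actual schema.

```python
from collections import Counter

# Illustrative records only: the keys below are hypothetical simplifications,
# not the DSA Transparency Database's real field names.
records = [
    {"platform": "ExamplePlatform", "decision_ground": "ILLEGAL_CONTENT",
     "automated_detection": True},
    {"platform": "ExamplePlatform", "decision_ground": "INCOMPATIBLE_CONTENT",
     "automated_detection": False},
    {"platform": "OtherPlatform", "decision_ground": "ILLEGAL_CONTENT",
     "automated_detection": True},
]

# Tally decisions per legal ground, as one might when cross-checking the
# database against a platform's transparency report.
by_ground = Counter(r["decision_ground"] for r in records)

# Share of decisions taken via automated detection.
automated_share = sum(r["automated_detection"] for r in records) / len(records)

print(by_ground["ILLEGAL_CONTENT"])   # -> 2
print(round(automated_share, 2))      # -> 0.67
```

Since 1 July 2025 the database's categories match those of the harmonised transparency reports, so aggregates computed this way can be compared directly with the figures platforms publish.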
Access the DSA Transparency Database
Data access for researchers, the Digital Services Coordinators, and the Commission
The DSA sets out the obligation for VLOPs and VLOSEs to give access to certain data for the purpose of conducting research that contributes to:
- the detection, identification and understanding of systemic risks in the EU
- the assessment of adequacy, efficiency and impacts of risk mitigation measures taken by those providers
Under the DSA, researchers can gain access to publicly available data if they meet the relevant conditions, such as independence from commercial interests and the ability to uphold adequate security standards. They can access VLOPs' and VLOSEs' publicly available data without the intermediation of Digital Services Coordinators to conduct research that contributes to the detection, identification and understanding of systemic risks in the Union.
In July 2025, the Commission adopted the delegated act on data access under the Digital Services Act, which was transmitted to the Parliament and the Council for a three-month scrutiny period. The act clarifies the procedures for VLOPs and VLOSEs to share internal data with vetted researchers, detailing data formats, documentation requirements, and the information that Digital Services Coordinators (DSCs), VLOPs and VLOSEs must make public to facilitate applications for access to relevant datasets. This will empower the research community to play a crucial role in scrutinising the systemic risks posed by these platforms and to contribute to safeguarding the online environment.
The Commission also launched the DSA data access portal, providing a central hub for researchers to find information and communicate with VLOPs, VLOSEs, and DSCs on their applications to access internal data. Researchers seeking vetted status must demonstrate their affiliation to a research organisation, independence from commercial interests, and ability to manage data securely and privately, while also committing to publish their findings.
The Board of Digital Services has also endorsed a proposal to enhance cooperation among DSCs in the vetting process, ensuring a coordinated approach to this vital new mechanism. The rules have been tested through a pilot project led by the European Research Council.
DSA Whistleblower tool
The DSA Whistleblower tool is designed to enable employees and other insiders to report harmful practices of VLOPs/VLOSEs anonymously. We welcome reports that shed light on any violations related to a wide range of areas including content moderation, recommender systems, advertising, risk assessment, public security, civic discourse, and children’s rights. These reports can take various forms such as official reports, memos, email exchanges, data metrics, or any other context that provides relevant information.
Risk assessment and audit reports
At least once a year, VLOPs and VLOSEs must identify and analyse risks stemming from their services, such as the dissemination of illegal content, disinformation and risks to the protection of minors. They must also outline the measures they have put in place to mitigate the identified risks.
VLOPs and VLOSEs must also be audited at least once a year. At the latest three months after receiving the audit report from their independent auditor, VLOPs and VLOSEs must publish it together with their risk assessment reports, the risk mitigation measures implemented, a report on how they implement the recommendations received from their auditors, and information about any consultations that supported the risk assessment and mitigation measures they put in place.
The mandatory disclosure of all these reports helps bring about further transparency and accountability and offers a basis for public scrutiny.
In October 2023, the Commission published a delegated act detailing the framework for the preparation and issuance of audit reports and audit implementation reports.
| VLOP/VLOSE | Risk assessment report | Audit report | Audit implementation report |
| --- | --- | --- | --- |
| AliExpress | | | |
| Amazon | | | |
| Apple | | | |
| Bing | | | |
| Booking | | | |
| | 2024* | | |
| | 2023** 2024** 2025** | | |
| | 2024* | | |
| | | | |
| | | | |
| Pornhub | | | |
| Shein | | | |
| Snapchat | | | |
| Stripchat | | | |
| Temu | | | |
| TikTok | | | |
| Wikipedia | | | |
| X | | | |
| XNXX | | | |
| Xvideos | | | |
| Zalando | | | |
*All the reports for both Instagram and Facebook are published under Meta's Transparency Centre. Find the name of the report, select the reporting period and the platform to download the report.
**Both the risk assessment report and the audit implementation report cover all VLOPs and VLOSEs under Google.
Digital Services Terms and Conditions Database
Online terms and conditions can be a source of confusion and frustration. They can be opaque or unspecific, and prone to frequent change, leaving users and businesses in the dark about their rights and obligations. This database leverages the transparency obligations for digital services under the DSA and the P2B Regulation to alleviate this issue. It stores over 790 terms and conditions documents from more than 400 services, including Commercial Terms, Developer Terms, Live Policies, Terms of Service, Privacy Policies, and more. The database re-checks the documents in its repository across the web multiple times per day and automatically highlights changes, enabling regulators, researchers, and users to keep up with the evolving digital landscape effortlessly.
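Change tracking of this kind is commonly built on content fingerprinting: store a hash of each document, re-fetch it periodically, and flag a change when the hash differs. The sketch below illustrates that generic technique under simple assumptions; it is not the database's actual implementation.

```python
import hashlib

def fingerprint(document_text: str) -> str:
    """Return a stable SHA-256 fingerprint of a terms-and-conditions text."""
    # Normalise whitespace so trivial reflows do not register as changes.
    normalised = " ".join(document_text.split())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Hypothetical snippets standing in for fetched terms-and-conditions pages.
stored = fingerprint("You may not resell the service. We may change these terms.")
refetched = fingerprint("You may not resell the service.  We may change\nthese terms.")
changed = fingerprint("You may not resell the service. Terms change without notice.")

print(stored == refetched)  # True: whitespace-only differences are ignored
print(stored == changed)    # False: a substantive edit is flagged
```

When a fingerprint changes, a real system would then diff the stored and re-fetched texts to highlight exactly which clauses moved, in the way the database surfaces changes to its users.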