The Digital Services Act (DSA) details a range of actions to promote transparency and accountability of online services, without hindering innovation and competitiveness.
Transparency reports
Starting from 17 February 2024, and at least once a year, all providers of intermediary services have to publish reports on their content moderation. The reports include information on their content moderation practices, such as the number of orders received from all relevant national judicial or administrative authorities, the measures that make up their content moderation practices, the number of pieces of content taken down, and the accuracy and error rate of their automated content moderation systems. In addition, hosting providers need to report the number of notices they received from users and trusted flaggers, whereas online platforms also have to provide information on out-of-court dispute settlements and the number of suspensions imposed on users for misusing their services.
In view of the additional risks relating to their activities, their size and impact, and their additional obligations under this Regulation, further transparency requirements apply to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). In particular, VLOPs and VLOSEs must publish their transparency reports at least every six months. The reports must also include information on their content moderation teams, including their qualifications and linguistic expertise.
VLOPs and VLOSEs have to publish their first reports including the additional transparency requirements at the latest six months after their designation.
Please find an overview below, noting that not all services keep their previous reports online:
The VLOPs and VLOSEs designated on 25 April 2023 published their first set of reports at the end of October 2023 and the second round at the end of April 2024.
| Service | First report | Second report | Third report |
| --- | --- | --- | --- |
| AliExpress | | | |
| Amazon | October 2023 | April 2024* | October 2024 |
| Apple Store | October 2023 | April 2024 | October 2024 |
| Bing | October 2023 | April 2024 | October 2024 |
| Booking.com | October 2023 | April 2024 | October 2024 |
| Google Services | October 2023 | April 2024 | October 2024 |
| | October 2023 | April 2024 | October 2024 |
| | October 2023 | April 2024 | October 2024** |
| | October 2023 | April 2024** | October 2024** |
| | October 2023 | April 2024 | October 2024 |
| Snapchat | October 2023 | April 2024 | October 2024 |
| TikTok | | | |
| Zalando | October 2023 | April 2024 | October 2024 |
| Wikipedia | October 2023 | April 2024 | October 2024 |
| X | November 2023 | April 2024 | October 2024 |
*The webpage opens with the latest report, but previous reports are available at the bottom of the webpage.
**Reports for both Facebook and Instagram are downloadable under the heading "EU Digital Services Act: Transparency Reports for Very Large Online Platforms".
The services designated in December 2023 (Pornhub, Stripchat, and XVideos) had to publish their first transparency reports by the end of June 2024.
| Service | First report |
| --- | --- |
| Pornhub | June 2024 |
| Stripchat | June 2024 |
| XVideos | June 2024 |
Shein was designated in April 2024 and the deadline for their first transparency report was October 2024. Similarly, Temu was designated in May 2024 and the deadline for their first transparency report was November 2024.
| Service | First report |
| --- | --- |
| Shein | October 2024 |
| Temu | November 2024 |
These reports continue to provide valuable insights into the practices and outcomes of content moderation by these major digital players.
In November 2024, the Commission adopted an implementing regulation laying down templates for the transparency reports, including harmonised reporting periods. The implementing regulation also provides guidance on filling in the templates and ensures that the providers' transparency reports are comparable.
Access the regulation, the templates and instructions
Publication of monthly average user numbers
The DSA requires all providers of online platforms, except small and micro-enterprises, to publish the number of monthly users of their services in the EU and to update it every 6 months. This obligation took effect on 17 February 2023. The data from this self-assessment was fundamental to designating the first Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) and will continue to be important in monitoring the landscape of intermediary service providers. The Commission may also use other information available to it when monitoring the number of monthly users of platform services in the EU.
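The designation threshold for VLOPs and VLOSEs is 45 million average monthly users in the EU. As a purely illustrative sketch of the arithmetic behind the threshold check (the function, field names, and sample figures below are hypothetical, not official Commission tooling):

```python
# Illustrative sketch: average self-reported monthly EU user counts over a
# reporting window and compare against the DSA's 45 million
# average-monthly-users threshold for VLOP/VLOSE designation.
# Function names and sample figures are hypothetical, not official tooling.

VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the EU

def exceeds_designation_threshold(monthly_users: list[int]) -> bool:
    """Return True if the average of the monthly counts exceeds the threshold."""
    if not monthly_users:
        return False
    average = sum(monthly_users) / len(monthly_users)
    return average > VLOP_THRESHOLD

# Example: six months of made-up self-reported EU user numbers
counts = [44_000_000, 46_500_000, 47_000_000, 45_900_000, 46_200_000, 46_400_000]
print(exceeds_designation_threshold(counts))  # True: the average is 46 million
```

Individual months may dip below the line; what matters in this sketch is the average over the window.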
DSA Transparency Database & Statements of reasons
Since the end of August 2023, Very Large Online Platforms (VLOPs) need to inform users whenever they remove or otherwise restrict access to their content and explain the reasons behind each moderation decision to the affected users in a “statement of reasons.” Statements of reasons must contain clear and specific information, spelling out the reason(s), and a legal or terms of service reference based on which content was removed or restricted. Once the DSA is fully in force in February 2024, the obligation to provide users with statements of reasons explaining content moderation decisions will apply to all hosting services.
On 25 September 2023, the Commission launched the DSA Transparency Database, which collects and makes publicly available statements of reasons almost in real time to enable scrutiny over the content moderation decisions of providers of online platforms. Everyone can access the DSA Transparency Database website and search for, read, and download the statements of reasons published there.
Before submitting their statements of reasons to the DSA Transparency Database, online platforms, currently VLOPs only, need to ensure that they do not contain personal data.
Once the DSA is fully in force on 17 February 2024, all online platforms will need to submit all their statements of reasons to the DSA Transparency Database.
Access the DSA Transparency Database
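Statements of reasons downloaded from the database can be analysed programmatically, for instance to tally moderation decisions per restriction type. A minimal sketch follows; the record layout and field names here are illustrative assumptions, not the database's actual schema:

```python
# Illustrative sketch: tally statements of reasons by restriction type.
# The field names ("platform", "restriction") are hypothetical stand-ins;
# consult the DSA Transparency Database documentation for the real schema.
from collections import Counter

def tally_restrictions(statements: list[dict]) -> Counter:
    """Count how often each restriction type appears in the records."""
    return Counter(s.get("restriction", "unknown") for s in statements)

# Example with made-up records mimicking downloaded data
sample = [
    {"platform": "ExamplePlatform", "restriction": "removal"},
    {"platform": "ExamplePlatform", "restriction": "visibility_limited"},
    {"platform": "OtherPlatform", "restriction": "removal"},
]
print(tally_restrictions(sample))  # Counter({'removal': 2, 'visibility_limited': 1})
```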
Data Access for researchers, the Digital Service Coordinators, and the Commission
The DSA sets out the obligation for VLOPs and VLOSEs to give access to certain data for the purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the EU, as well as to the assessment of adequacy, efficiency and impacts of risk mitigation measures taken by those providers.
Data access will be possible for:
- Digital Services Coordinators of the Member State where a provider of a VLOP or VLOSE is established and the Commission for the purpose of monitoring and assessing compliance with the DSA.
- Vetted researchers who fulfil the criteria set out in the Regulation, following an assessment by the relevant Digital Service Coordinator. Researchers will benefit from access to data, often including previously undisclosed or under-disclosed data, opening up new avenues for research, and increasing the potential of generating knowledge for the benefit of all.
- Researchers, who can gain access to publicly available data if they meet the relevant conditions (such as independence from commercial interests, or the ability to uphold adequate security standards). Such researchers should already today be able to gain access to VLOPs'/VLOSEs' publicly accessible data without the intermediation of Digital Services Coordinators to conduct research that contributes to the detection, identification and understanding of systemic risks in the Union.
To further specify the data access mechanism for vetted researchers, the Commission is working on a delegated act. It will be finalised after a public consultation and once the Board of Digital Services Coordinators is consulted.
DSA Whistleblower tool
The DSA Whistleblower tool is designed to enable employees and other insiders to report harmful practices of VLOPs/VLOSEs anonymously. We welcome reports that shed light on any violations related to a wide range of areas including content moderation, recommender systems, advertising, risk assessment, public security, civic discourse, and children’s rights. These reports can take various forms such as official reports, memos, email exchanges, data metrics, or any other context that provides relevant information.
Risk assessment and audit reports
At least once a year, VLOPs and VLOSEs must identify and analyse risks stemming from their services, such as the dissemination of illegal content, disinformation and risks to the protection of minors. They must also outline the measures they have put in place to mitigate the identified risks.
VLOPs and VLOSEs must also be subject to audits at least once a year. At the latest three months after the receipt of the audit reports from the independent auditor, VLOPs and VLOSEs must publish their audit report together with the risk assessment reports, the implemented risk mitigation measures, a report on how they implement recommendations received from their auditors, and information about any consultations that supported the risk assessment and risk mitigation measures they put in place.
The mandatory disclosure of all these reports helps bring about further transparency and accountability and offers a basis for public scrutiny.
In October 2023, the Commission published a delegated act detailing the framework for the preparation and issuance of audit reports and audit implementation reports.
| VLOP/VLOSE | Risk assessment report | Audit report | Audit implementation report |
| --- | --- | --- | --- |
| AliExpress | | | |
| Amazon | | | |
| Apple | | | |
| Bing | | | |
| Booking | | | |
| | | | |
| Google | 2023* | | |
| | | | |
| | | | |
| | | | |
| Snapchat | | | |
| TikTok | | | |
| Wikipedia | | | |
| X | | | |
| Zalando | | | |
*Both the risk assessment report and the audit implementation report cover all VLOPs and VLOSEs under Google.
Digital Services Terms and Conditions Database
Online terms and conditions can be a source of confusion and frustration. They can be opaque or unspecific, and prone to frequent changes, leaving users and businesses in the dark about their rights and obligations. This database leverages transparency obligations for digital services stemming from the DSA and the P2B Regulation to help alleviate this issue. It stores over 790 terms and conditions documents from more than 400 services, covering a variety of document types such as Commercial Terms, Developer Terms, Live Policy, Terms of Service, and Privacy Policy. The database checks the contracts in its repository against the versions published on the web multiple times per day and automatically highlights new changes, enabling regulators, researchers, and users to keep up with the evolving digital landscape effortlessly.
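The change-highlighting described above can be approximated with a fingerprint-and-diff approach: hash each stored document to detect any change cheaply, then compute a line-level diff when the hash differs. A minimal sketch, not the database's actual implementation:

```python
# Illustrative sketch of terms-and-conditions change detection:
# fingerprint each stored document, and when a re-fetched version's
# fingerprint differs, produce a line-level diff of the change.
# This is not the actual implementation of the Commission's database.
import difflib
import hashlib

def fingerprint(text: str) -> str:
    """Stable hash of a document, used to detect any change cheaply."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def highlight_changes(old: str, new: str) -> list[str]:
    """Return unified-diff lines if the document changed, else an empty list."""
    if fingerprint(old) == fingerprint(new):
        return []
    return list(difflib.unified_diff(
        old.splitlines(), new.splitlines(),
        fromfile="stored", tofile="fetched", lineterm="",
    ))

# Example with two made-up versions of a clause
old_terms = "1. Users may post content.\n2. We may remove content."
new_terms = "1. Users may post content.\n2. We may remove or demote content."
for line in highlight_changes(old_terms, new_terms):
    print(line)
```

At scale, the hash comparison keeps the frequent re-checks cheap; the more expensive diff is only computed for documents that actually changed.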
Related Content
Big Picture
The Digital Services Act and Digital Markets Act aim to create a safer digital space where the fundamental rights of users are protected and to establish a level playing field for businesses.
See Also
Under the DSA, trusted flaggers are responsible for detecting potentially illegal content and alerting online platforms. They are entities designated by the national Digital Services Coordinators.
The European Board for Digital Services is an independent advisory group that has been established by the Digital Services Act, with effect from 17 February 2024.
The DSA (Digital Services Act) whistleblower tool allows employees and other insiders to report harmful practices of Very Large Online Platforms and Search Engines (VLOPs/VLOSEs).
Digital Services Coordinators help the Commission to monitor and enforce obligations in the Digital Services Act (DSA).
This page provides an overview of the designated Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) supervised by the Commission and the main enforcement activities.
Since August 2023, platforms have already started to change their systems and interfaces according to the Digital Services Act (DSA) to provide a safer online experience for all.
The enforcement of the Digital Services Act (DSA) includes a full set of investigative and sanctioning measures that can be taken by national authorities and the Commission.
The Digital Services Act (DSA) provides a framework for cooperation between the Commission, EU and national authorities to ensure platforms meet its obligations.
Very large online platforms and search engines are those with over 45 million users in the EU. They must comply with the most stringent rules of the DSA.
Find out how the DSA can make the online world safer and protect your fundamental rights.
The European Centre for Algorithmic Transparency (ECAT) is committed to improved understanding and proper regulation of algorithmic systems.