Since August 2023, platforms have been adapting their systems and interfaces to comply with the Digital Services Act (DSA) and provide a safer online experience for all.
The DSA significantly improves the mechanisms for the removal of illegal content and for the effective protection of users’ fundamental rights online, including freedom of speech. It also creates a stronger public oversight mechanism for online platforms, in particular for Very Large Online Platforms (VLOPs), which reach more than 10% of the EU’s population.
Easier reporting of illegal content
The DSA requires platforms to put in place measures to counter the spread of illegal goods, services or content online, such as mechanisms for users to flag such content and for platforms to cooperate with “trusted flaggers”. Those with the sharpest eyes may already have noticed some changes to the online environment, such as X’s new feature for reporting illegal content, available by clicking on the three small dots in the top right corner of each post. X is not the only platform to have introduced new, user-friendly options for flagging illicit content: Apple, Pinterest, Facebook, Instagram and TikTok have also put in place easy ways to report illegal content on their platforms.
Greater transparency in content moderation and more options to appeal
Online platforms are a digital space where we express ourselves, showcase our work, and stay in contact with friends or customers. This is why it is particularly frustrating when our content gets removed, or the reach of our posts is inexplicably reduced. Under the DSA, providers of intermediary services, including online platforms, must tell their users why their content has been removed or why access to an account has been restricted. Providers of hosting services, including online platforms, now have an express legal obligation to provide clear and specific statements of reasons for their content moderation decisions. The DSA also empowers users to challenge such decisions through an out-of-court dispute settlement mechanism.
Even online platforms that already informed users about their content moderation decisions, such as Facebook and Instagram, now provide a greater range of information about those decisions.
Moreover, to ensure transparency and enable scrutiny of the content moderation decisions of providers of online platforms, as required by the DSA, the Commission launched the DSA Transparency Database: a first-of-its-kind database that makes all statements of reasons submitted by providers of online platforms for their content moderation decisions accessible to the public. VLOPs must publish their statements of reasons starting from the end of August 2023, while the deadline for all other providers of online platforms falling within the scope of the DSA is 17 February 2024.
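Because the database is public, its contents can in principle be retrieved programmatically as well as browsed. The minimal Python sketch below illustrates what such a query could look like; the endpoint path, parameter names and response fields are assumptions for illustration only, not the documented interface, which is described (together with bulk data downloads) on the database’s website at transparency.dsa.ec.europa.eu.

```python
# Minimal sketch: fetching recent statements of reasons from the public
# DSA Transparency Database. The endpoint path, query parameters and
# response fields below are illustrative assumptions, NOT the documented
# API; consult https://transparency.dsa.ec.europa.eu for the real
# interface and its bulk data downloads.
import requests

BASE_URL = "https://transparency.dsa.ec.europa.eu"  # public database site

def fetch_statements(platform_name: str, limit: int = 10) -> dict:
    """Fetch recent statements of reasons for one platform (hypothetical endpoint)."""
    resp = requests.get(
        f"{BASE_URL}/api/v1/statements",  # assumed path, for illustration only
        params={"platform_name": platform_name, "per_page": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Each statement of reasons explains one moderation decision:
    # what was restricted and on what legal or terms-of-service ground.
    for sor in fetch_statements("X").get("data", []):
        print(sor.get("decision_ground"), "-", sor.get("category"))
```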
More knowledge and choice over what we see – and more control over personalisation options
The DSA obliges providers of online platforms to guarantee greater transparency and control over what we see in our feeds. This should allow us to discover on what basis online platforms rank content in our feeds and to decide whether we want to opt out of personalised recommendations, since VLOPs must offer an option to turn off personalised content. Similar obligations apply to ads: in addition to greater transparency and control over why we see a certain advertisement in our feed, platforms must label ads, and VLOPs must maintain a repository with details of the paid advertising campaigns run on their online interfaces.
TikTok, Facebook and Instagram currently offer an option to disable the personalised feed on their platforms.
Member States and the Commission will, in close cooperation, supervise and enforce the provisions on the transparency of recommender systems and advertising, including the implementation of ad repositories.
Zero tolerance for targeting ads at children and teens or based on sensitive data
The DSA bans targeted advertising aimed at minors on online platforms, and VLOPs have taken steps to comply with this prohibition. For example, Snapchat, Alphabet’s Google and YouTube, and Meta’s Instagram and Facebook no longer allow advertisers to show targeted ads to underage users. TikTok and YouTube now also set the accounts of users under 16 to private by default.
Targeted advertising on online platforms is also prohibited when profiling uses special categories of personal data, such as ethnicity, political views, or sexual orientation.
Protection for children
According to the new rules, online platforms that are accessible to children should protect the privacy and security of those users, as well as their mental and physical well-being, for instance by adopting special privacy and security settings by default. Online platforms may put in place age verification measures to control who can access their services, parental controls so that parents and guardians can help protect their children against the risk of exposure to harmful content, and tools that allow users to signal abuse or get support. Some first steps are being taken.
For example, TikTok and YouTube, in addition to banning targeted ads for minors, have set the profiles of minors to private by default, which means that the videos they upload can be viewed only by people they approve.
Integrity of elections
The DSA requires VLOPs and Very Large Online Search Engines (VLOSEs) to identify, analyse, and mitigate with effective measures the risks related to electoral processes and civic discourse, while ensuring the protection of freedom of expression. The Slovak parliamentary elections of 30 September 2023 were the first test case for these requirements after the entry into application of the DSA for VLOPs and VLOSEs. Thanks to the DSA, the approach of providers of VLOPs and VLOSEs to electoral integrity has changed. Progress has been made on shorter response times to flagging by local authorities and trusted partners, clearer escalation processes for disinformation and misinformation, increased fact-checking capabilities, and an overall increase in resources and capacity.
New obligations on traceability of business users in online marketplaces
The DSA introduces new obligations for providers of online marketplaces to counter the spread of illegal goods. In particular, such providers must now ensure that sellers provide verified information on their identity before they can start selling goods on those marketplaces, and must make it easy for users to identify the person responsible for a sale. Moreover, if a provider of an online marketplace becomes aware that a seller is offering an illegal product or service, it must inform the users who purchased it, indicating the identity of the seller and the options for redress.
The Commission, along with Member States, will supervise and enforce compliance with the obligations against the spread of illegal goods on online marketplaces.
These are some of the changes that we, as users, can notice immediately in our everyday digital interactions.
The Commission’s work as enforcer has just begun. We will be checking whether these first steps are genuine and what impact they have on the safety of users and society as a whole, and we will use the means at our disposal to create a better digital reality and future. Just as we reject them in the offline world, harassment, bullying, and illegal content have no place on online platforms, which have become integral tools of our daily lives and those of our children. More changes are coming, and the monitoring of the application of the measures introduced by the DSA has only just started. The goal, however, has not changed: a safer, more inclusive digital world for all.
Related Content
Big Picture
The Digital Services Act and Digital Markets Act aim to create a safer digital space where the fundamental rights of users are protected and to establish a level playing field for businesses.
See Also
Under the DSA, trusted flaggers are responsible for detecting potentially illegal content and alerting online platforms. They are entities designated by the national Digital Services Coordinators.
The European Board for Digital Services is an independent advisory group that has been established by the Digital Services Act, with effect from 17 February 2024.
The DSA (Digital Services Act) whistleblower tool allows employees and other insiders to report harmful practices of Very Large Online Platforms and Search Engines (VLOPs/VLOSEs).
Digital Services Coordinators help the Commission to monitor and enforce obligations in the Digital Services Act (DSA).
The Digital Services Act (DSA) details a range of actions to promote transparency and accountability of online services, without hindering innovation and competitiveness.
This page provides an overview of the designated Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) supervised by the Commission and the main enforcement activities.
The enforcement of the Digital Services Act (DSA) includes a full set of investigative and sanctioning measures that can be taken by national authorities and the Commission.
The Digital Services Act (DSA) provides a framework for cooperation between the Commission, EU and national authorities to ensure platforms meet its obligations.
Very large online platforms and search engines are those with over 45 million users in the EU. They must comply with the most stringent rules of the DSA.
Find out how the DSA can make the online world safer and protect your fundamental rights.
The European Centre for Algorithmic Transparency (ECAT) is committed to improved understanding and proper regulation of algorithmic systems.