
The impact of the Digital Services Act on digital platforms

With the Digital Services Act (DSA), platforms have already changed their systems and interfaces to provide a safer online experience for all Europeans.

The description below was prepared by the Commission services for information purposes only. It neither binds the Commission, nor does it entail an endorsement or prejudice any later assessment in any way.

The DSA significantly improves the mechanisms for the removal of illegal content and for the effective protection of users’ fundamental rights online, including freedom of speech. It also creates a stronger public oversight mechanism for online platforms, in particular for Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs): services with more than 45 million monthly users, reaching over 10% of the EU’s population.

Easier reporting of illegal content

The DSA requires platforms to put in place measures to counter the spread of illegal goods, services or content online, such as mechanisms for users to flag such content.

You may already have noticed some changes to the online environment, such as X’s feature for reporting illegal content, available by clicking the three small dots in the top right corner of each post. Apple, Pinterest, TikTok, Facebook and Instagram have also put in place new ways to report illegal content on their platforms.

The DSA also requires platforms to treat with priority notices submitted by trusted flaggers. These are impartial experts specialised in detecting certain types of illegal content. Over 60 trusted flaggers have already been appointed by national authorities, for example in relation to the protection of minors, the detection of terrorist content and the identification of consumer scams. 

Greater transparency in content moderation and more options to appeal

Online platforms are a digital space where we express ourselves, showcase our work and connect with friends or customers. This is why it is particularly frustrating when our content gets removed, or the reach of our posts is inexplicably reduced. 

With the DSA, providers of intermediary services, including online platforms, must communicate to their users why they have removed their content, or why access to an account has been restricted, through clear and specific statements of reasons for any content moderation decision they take.

To ensure transparency and enable scrutiny over the content moderation decisions made by providers of online platforms, as required by the DSA, the Commission launched the DSA Transparency Database. The database is the first of its kind: it provides public access to all anonymised and aggregated statements of reasons submitted by providers of online platforms on their content moderation decisions.

For example, the database shows that in the first half of 2025 alone, platforms reported more than 9 billion content moderation decisions, 99% of which were taken proactively on the basis of the platforms’ own terms and conditions. Only a marginal fraction related to reports of illegal content.
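
To illustrate the kind of analysis the database makes possible, the short sketch below tallies a local export of statements of reasons by the ground invoked for each decision. It is an illustrative example only: the file name, column names and values are assumptions, and the actual schema and download formats are those documented on the DSA Transparency Database website.

```python
# Illustrative sketch only: assumes a local CSV export of statements of reasons
# with hypothetical column names ("decision_ground", "automated_detection").
# The real DSA Transparency Database schema and access methods may differ.
import csv
from collections import Counter

def summarise_statements(path: str) -> None:
    grounds = Counter()
    automated = 0
    total = 0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total += 1
            # e.g. a terms-and-conditions ground vs. an illegal-content ground
            grounds[row["decision_ground"]] += 1
            if row.get("automated_detection", "").strip().lower() == "yes":
                automated += 1
    if total == 0:
        print("no statements of reasons found")
        return
    for ground, count in grounds.most_common():
        print(f"{ground}: {count} ({count / total:.1%})")
    print(f"detected automatically: {automated / total:.1%}")

if __name__ == "__main__":
    summarise_statements("statements_of_reasons.csv")
```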

The DSA also mandates further transparency from platforms on their processes and content moderation practices. Online platforms such as Snapchat, TikTok, Pinterest, AliExpress, Facebook and Instagram now offer a greater range of content moderation information. This includes the number of content takedowns, the accuracy of automated systems, the number of user complaints received, and information on content moderation teams.

Under the DSA, users have the right to challenge content moderation decisions. Those who are dissatisfied can choose between complaining to the platform and using an out-of-court dispute settlement mechanism.

Since 2024, users in the EU have appealed more than 165 million content moderation decisions taken by VLOPs and VLOSEs through the platforms’ internal complaint mechanisms, leading to a reversed decision in almost 30% of the cases. In the first half of 2025 alone, out-of-court dispute settlement bodies reviewed over 1,800 disputes relating to content disseminated in the EU on Facebook, Instagram and TikTok, reversing the platforms’ decisions in 52% of the closed cases and restoring content and accounts faster and more cheaply than going to court.

More knowledge and choice over what we see – and more control over personalisation options

The DSA obliges providers of online platforms to guarantee greater transparency and control over what we see in our feeds. This allows us to discover on what basis online platforms rank content in our feeds and to decide whether we want to opt out of personalised recommendations, since VLOPs must offer an option to turn off personalised content. TikTok, Facebook and Instagram currently offer an option to disable the personalised feed on their platforms.

Similar obligations apply to ads: in addition to further transparency and control over why we see a certain advertisement in our feed, platforms must label ads, and VLOPs must maintain a repository with details on the paid advertising campaigns run on their online interfaces.

Member States and the Commission supervise and enforce, in close cooperation, the provisions on the transparency of recommender systems and advertising, including the implementation of ad repositories.

Following proceedings opened by the Commission, the providers of TikTok and AliExpress committed to providing advertising repositories that ensure full transparency around ads on their respective services.

Where no agreement can be reached on how to ensure that an advertising repository is DSA compliant, the Commission is committed to taking action, as shown by the EUR 45 million fine imposed on the provider of X for its non-compliant advertising repository.

Zero tolerance on targeting ads to children and teens and on targeting ads based on sensitive data

The DSA bans targeted advertising to minors on online platforms. VLOPs have taken steps to comply with this prohibition. For example, Snapchat, TikTok and Meta’s Instagram and Facebook no longer allow advertisers to show targeted ads to underage users.

Targeted advertising on online platforms is also prohibited when the profiling uses special categories of personal data, such as ethnicity, political views or sexual orientation.

Protection for minors

According to the DSA, online platforms accessible to minors must protect their mental and physical well-being, privacy and security.

In July 2025, the Commission adopted the guidelines for protecting children and a blueprint for an age verification solution. The guidelines address issues such as addictive design, for example by disabling features such as ‘read receipts’ to curb excessive use, and cyberbullying, by empowering minors to block users and by preventing unwanted content downloads. They also aim to mitigate the impact of harmful content by giving minors more control over recommendations and by promoting private-by-default accounts to prevent unwanted contact from strangers.

The guidelines also recommend the use of effective age assurance methods online, provided that they are accurate, reliable, robust, non-intrusive and non-discriminatory. In particular, they recommend that age verification measures be put in place for adult content, such as pornography or gambling, or when national law sets a minimum age for social media.

The blueprint for an age verification solution offers a privacy-preserving method for users to confirm their age without disclosing any additional personal information. This new standard for online age assurance is currently in a pilot phase: the software will be tested and refined in collaboration with Member States, online platforms and end-users. A second blueprint, published in October, added the use of passports and identity cards for onboarding, as well as support for the Digital Credentials API.
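
The core idea behind such a privacy-preserving age check can be sketched in a few lines: a trusted issuer signs a statement containing nothing more than an “over 18” flag, and the platform verifies the signature without learning who the user is. The sketch below is a conceptual illustration under simplified assumptions, not the Commission’s blueprint or its actual protocol; the issuer, message format and key handling are invented for the example.

```python
# Conceptual sketch of a privacy-preserving age attestation: a trusted issuer
# signs only an "over 18" claim, and the verifying platform learns nothing else
# about the user. This is NOT the EU age verification blueprint; the message
# format and key handling are simplified assumptions for illustration.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Issuer side: a trusted party (e.g. a public authority) holds the signing key.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

def issue_attestation(is_over_18: bool) -> tuple[bytes, bytes]:
    """Return (message, signature) asserting only the age flag."""
    message = json.dumps({"over_18": is_over_18}).encode()
    return message, issuer_key.sign(message)

def verify_attestation(message: bytes, signature: bytes) -> bool:
    """Platform side: check the signature and read the single boolean claim.
    No name, birth date or ID number is disclosed to the platform."""
    try:
        issuer_public_key.verify(signature, message)
    except InvalidSignature:
        return False
    return bool(json.loads(message).get("over_18", False))

message, signature = issue_attestation(is_over_18=True)
print(verify_attestation(message, signature))  # True
```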

Integrity of elections

The DSA requires VLOPs and VLOSEs to identify, analyse and mitigate, with effective measures, risks related to electoral processes and civic discourse, while ensuring the protection of freedom of expression.

In March 2024, the Commission published guidelines for VLOPs and VLOSEs on mitigating online risks to electoral processes.

Following the guidance, Bing and Google improved the response quality of their AI chatbots and implemented measures to avoid hallucinations, while the provider of Facebook and Instagram committed to better labelling AI-generated deepfakes.

Obligations on traceability of business users in online marketplaces

The DSA introduces obligations for providers of online marketplaces to counter the spread of illegal goods. In particular, such providers must ensure that sellers provide verified information on their identity before they can start selling their goods on those marketplaces. They must also guarantee that users can easily identify the person responsible for the sale.

In June 2025, the Commission made binding a series of wide-ranging commitments offered by the provider of AliExpress to address a number of the Commission’s concerns, improving the traceability of its traders, the transparency of its advertising and recommender systems, and the access to data for researchers. Users of AliExpress can now easily report illegal content they encounter on the platform even if they are not registered with the platform. Offers of ‘adult items’ on AliExpress (e.g. sex toys), previously visible to minors, are now automatically detected and blurred.

If a provider of an online marketplace becomes aware that a seller is offering an illegal product or service, it must:

  • inform the users who purchased the illegal product or service
  • identify the seller
  • give options for redress

Several providers of online marketplaces have now created an EU-specific recall webpage to notify consumers of any such illegal products or services offered on their platform in the past six months and inform them about any relevant means of redress (including collective redress). Others are sending the recall notifications directly to consumers who bought the illegal products through their services.

These are some of the changes that we, as users, can notice in our everyday digital interactions.

Over these two years, the Commission and national Digital Services Coordinators have actively supervised the application of the DSA across Europe, opening 16 proceedings so far and securing tangible results for citizens, such as the withdrawal of TikTok Lite’s potentially addictive Reward Programme from the EU and the strengthened consumer safety measures introduced by AliExpress.

These investigations complement the horizontal actions led by the Board of Digital Services, which focus on the protection of minors and on tackling online scams and fraud. Digital Services Coordinators have also opened national investigations to ensure compliance with the DSA obligations under their competence.
