PRESS RELEASE
Publication date: 10 October 2025

Commission scrutinises safeguards for minors on Snapchat, YouTube, Apple App Store and Google Play under the Digital Services Act

The European Commission initiated the first investigative actions following the Guidelines on Protection of Minors under the Digital Services Act (DSA).

Text "Digital Services Act" inside a white triangle inside a white triangle against a blue background.

The Commission is requesting Snapchat, YouTube, Apple and Google to provide information on their age verification systems, as well as on how they prevent minors from accessing illegal products, including drugs or vapes, or harmful material, such as content promoting eating disorders.

Executive Vice-President for Tech Sovereignty, Henna Virkkunen, said:

We will do what it takes to ensure the physical and mental well-being of children and teens online. It starts with online platforms. Platforms have the obligation to ensure minors are safe on their services – be it through measures included in the guidelines on protection of minors, or equally efficient measures of their own choosing. Today, alongside national authorities in the Member States, we are assessing whether the measures taken so far by the platforms are indeed protecting children.

The Commission is requesting Snapchat to provide information about how it prevents children under 13 years of age from accessing its services, which is prohibited under the platform’s own terms of service. The Commission is also requesting Snapchat to provide information on the features it has in place to prevent the sale of illegal goods, such as vapes or drugs, to children.

With regard to YouTube, in addition to information on its age assurance system, the Commission is seeking more details on its recommender system, following reports of harmful content being disseminated to minors.

For Apple App Store and Google Play, the Commission is seeking information on how they manage the risk of users, including minors, being able to download illegal or otherwise harmful apps, including gambling apps and tools to create non-consensual sexualised content, the so-called ‘nudify apps’. The Commission is also seeking to understand how the two app stores apply apps' age ratings. 

To ensure effective enforcement of the Guidelines on protection of minors across all platforms, large and small, the Commission is taking further actions with the national authorities to identify platforms posing the greatest risk for children. 

Find out more about the:
  • Guidelines on Protection of Minors under the Digital Services Act
  • Supervision of the designated very large online platforms and search engines under the DSA
