Press release | Publication

Online platforms put special focus on elections in the third batch of reports under the Code of Practice on Disinformation

Today, the signatories of the Code of Practice on Disinformation, including major online platforms such as Google, Meta, Microsoft and TikTok, have published the third set of reports detailing the actions they are taking to combat the spread of disinformation online, with a particular focus on the upcoming European elections in June.


Image: © Alicja Nowakowska / Getty Images

Available on the Transparency Centre, the reports detail the measures that signatories have already implemented or are planning to take ahead of and during the European elections, as well as safeguards addressing the risks of generative AI.

Věra Jourová, Vice-President for Values and Transparency, said:

The Code proves to be an agile tool, where the whole range of actors, including platforms, civil society, advertising industry, etc. come together to work on the best possible measures to counter disinformation and reduce risks of information manipulation. As platforms outline their ongoing measures, I urge them to intensify efforts in order to be prepared and respond swiftly to foreign information manipulation and disinformation threats during elections.

Thierry Breton, Commissioner for Internal Market, said:

With European citizens soon casting their votes for a new European Parliament, platforms must use every tool at their disposal, from content moderation to labelling deep fakes, to ensure a free and fair information environment. With the DSA electoral guidelines that we present today, we specify how we expect them to translate those obligations into concrete action. This is not just about compliance; it's about safeguarding the very foundations of our democracies.

In these reports, platforms set out their measures to protect the integrity of elections, such as requiring advertisers or creators to clearly label any image, video or audio that has been digitally created or altered, cooperating with fact-checking organisations, promoting high-quality and authoritative information to voters, and developing targeted media literacy and pre-bunking campaigns. The first two sets of reports were published in February 2023 and September 2023. The next set is expected in autumn 2024.

The Commission intends to swiftly launch the procedure for recognising the Code of Practice as a Code of Conduct under the Digital Services Act. The Commission has also adopted the first ever DSA guidelines on elections, addressed to designated very large online platforms and search engines.

