Press release | Publication

Disinformation: signatories report on their actions to fight COVID-19 disinformation

The Commission has published the last set of reports of the COVID-19 monitoring programme, providing insight into the actions taken by the platforms that are signatories to the Code of Practice on Disinformation to limit disinformation about the pandemic.

Image: slide showing the principles of a Code of Practice for online platforms and social networks (source: European Commission)

The lessons learned from this programme are also feeding into the more robust reporting framework of the strengthened Code of Practice, which the signatories are expected to present in the coming weeks, building on the Commission's Guidance published last May.

Věra Jourová, Vice-President for Values and Transparency, said:

Disinformation related to the coronavirus crisis and Russia's war in Ukraine clearly shows that we need stronger tools to fight online disinformation. This exercise proved useful and improved transparency, but it also had its weaknesses. The lessons drawn from it will feed into the strengthened anti-disinformation Code, which will offer a much more robust and predictable framework to address disinformation.

Thierry Breton, Commissioner for Internal Market, added:

With the upcoming Code of Practice, we expect a solid framework to be swiftly operational in the fight against disinformation. Together with the Digital Services Act, they will help ensure platforms live up to their responsibility to limit the spread of such harmful content.

The monitoring of the platforms' measures to limit disinformation will continue under the revamped Code.

TikTok reported that, in March and April, the number of coronavirus-related videos violating its policies declined by 2,239 compared to the previous reporting period, coinciding with the lifting of restrictions in many EU countries; the number of videos containing medical misinformation or carrying a COVID-19 tag removed for violations decreased by 10,131 compared to January and February. Meta also reported that, in March and April, a considerably lower number of coronavirus-related pieces of content were removed from Facebook and Instagram for violations of its misinformation policies: 124,000 from Facebook and 14,000 from Instagram, compared to January and February. Microsoft and Google continued to prevent the publication of several million ad placements, including vaccine-related content, in accordance with their misleading content policies. Compared to the previous reporting period, Twitter reported a smaller number of accounts suspended for violating its Misleading Information Policy.

Last set of reports

Code of Practice on Disinformation