The Commission has received annual self-assessment reports from the online platforms and technology companies Google, Facebook, Twitter, Microsoft and Mozilla and from the trade association signatories to the Code of Practice against disinformation, detailing policies, processes and actions undertaken to implement their respective commitments under the Code during its first year of operation.
As foreseen in the December 2018 Action Plan, the Commission is now carrying out its comprehensive assessment of the effectiveness of the Code of Practice.
In addition to the self-assessments by the signatories, the Commission will take into account:
- Input from the European Regulators Group for Audiovisual Media Services (ERGA);
- An evaluation from a third-party organisation selected by the signatories to the Code;
- An assessment from an independent consultant engaged by the Commission;
- A report on the 2019 elections to the European Parliament.
On this basis, the Commission will present its comprehensive assessment in early 2020. Should the results under the Code prove unsatisfactory, the Commission may propose further measures, including measures of a regulatory nature.
Summary of the self-assessment reports
The self-assessment reports indicate comprehensive efforts by the signatories to implement their commitments over the last 12 months. The Code, as a self-regulatory standard, has provided an opportunity for greater transparency into the platforms’ policies on disinformation as well as a framework for structured dialogue to monitor, improve and effectively implement those policies. This represents progress over the situation prevailing before the Code’s entry into force, while further serious steps by individual signatories and the community as a whole are still necessary.
Reported actions taken by the platform signatories vary in speed and scope across the five pillars of the Code as well as across the signatories. In general, actions to empower consumers and the research community lag behind the implementation of the commitments that were subject to the Commission's targeted monitoring phase ahead of the European Parliament elections in May 2019. The latter concern the disruption of advertising and monetisation incentives for purveyors of disinformation, the transparency of political and issue-based advertising, and the integrity of services against inauthentic accounts and behaviours.
Overall, the reporting would benefit from more detailed and qualitative insights in some areas and from further big-picture context, such as trends. In addition, the metrics provided so far are mainly output indicators rather than impact indicators; that is, they measure the volume of actions taken rather than the effect of those actions on the spread of disinformation.
The reports from the Code signatories also indicate some intensification of joint efforts between the platforms and other stakeholders, including fact-checkers, researchers, civil society and national authorities. These efforts have aimed at improving the resilience of platforms' services against various forms of meddling and media manipulation, and at diluting the distribution of disinformation. However, the provision of data and search tools to the research community remains episodic and arbitrary and does not respond to the full range of research needs. Moreover, cooperation with fact-checkers across the EU is still sporadic and does not cover all Member States and EU languages. Further efforts are needed towards a mechanism allowing truly independent organisations to cooperate with the platforms, including via relevant and privacy-compliant access to datasets for research purposes.
Finally, the reports indicate that the trade association signatories have raised awareness of the Code over the past year and advocated its take-up among their members.