Algorithmic systems determine many aspects of our online experience; a music streaming app, for example, may use algorithms to suggest songs or bands to its users. With the ever-increasing societal impact of online platforms such as social networks, online marketplaces, and search engines, there is an urgent need for public oversight of the processes at the core of their business. This includes, in particular, how these platforms and search engines moderate content and how they curate information for their users.
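To make the streaming example concrete, here is a minimal, purely illustrative sketch (all names and data are invented, and real recommender systems are far more sophisticated) of suggesting songs based on co-listening: songs that often appear in the histories of users with overlapping taste are ranked highest.

```python
from collections import Counter

def suggest_songs(listening_histories, user_history, top_n=3):
    """Suggest songs that co-occur with the user's songs in other
    users' listening histories, ranked by co-occurrence count."""
    scores = Counter()
    user_songs = set(user_history)
    for history in listening_histories:
        # Only histories that overlap with the user's taste contribute.
        if user_songs & set(history):
            for song in history:
                if song not in user_songs:
                    scores[song] += 1
    return [song for song, _ in scores.most_common(top_n)]

# Invented example data: each inner list is one user's listening history.
histories = [
    ["song_a", "song_b", "song_c"],
    ["song_a", "song_c", "song_d"],
    ["song_b", "song_d"],
]
print(suggest_songs(histories, ["song_a"]))  # ["song_c", "song_b", "song_d"]
```

Even this toy version shows why such systems attract scrutiny: small design choices (here, which histories count and how ties are broken) directly shape what every user is shown.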
The European Centre for Algorithmic Transparency (ECAT) will contribute to a safer, more predictable and trusted online environment for people and businesses.
How algorithmic systems shape the visibility and promotion of content, and the societal and ethical impact of this, is an area of growing concern. Measures adopted under the Digital Services Act (DSA) call for algorithmic accountability and transparency audits.
The ECAT is part of the European Commission, hosted by the Joint Research Centre (JRC) - the Commission's in-house science and knowledge service - in close cooperation with the Directorate-General for Communications Networks, Content and Technology (DG CONNECT). It contributes scientific and technical expertise to the Commission's exclusive supervisory and enforcement role with respect to the systemic obligations placed on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) under the DSA.
Scientists and experts working at the ECAT will cooperate with industry representatives, academia and civil society organisations to improve our understanding of how algorithms work: they will analyse transparency, assess risks, and propose new transparent approaches and best practices.
What we do
- Technical tests on algorithmic systems to better understand their functioning
- Analysis of transparency reports, risk assessments and independent audits
- Support to investigations, such as inspections
- Definition of relevant procedures for ensuring data access for regulators and researchers
Scientific research and foresight
- Identify and measure systemic risks associated with VLOPs and VLOSEs
- Evaluate risk mitigation measures and propose practical methodologies for transparent and accountable algorithmic approaches
- Study the long-term societal impact of algorithms
- Identify emerging risks associated with the use of VLOPs and VLOSEs
Networking and community building
At the international level, the ECAT will centralise knowledge from across the globe, sharing key scientific findings and best practices for data-driven studies in this area, and act as a knowledge hub for research conducted through the data access provided for under the DSA.
- Algorithmic transparency
- Recommender systems, information retrieval, search engines
- Fairness, accountability and transparency
- Ethical, economic, legal and social impact
- Risk assessment and risk mitigation measures
- Fundamental rights
Our approach is applied, interdisciplinary and inclusive.
Our mission is to provide technical assistance and practical guidance for transparent and trustworthy algorithmic systems that ensure a safe, predictable and trusted online environment.
We combine methodologies from different disciplines to integrate technical, ethical, economic, legal and environmental perspectives.
We engage with the international community of researchers and practitioners across relevant disciplines, including academia, industry, civil society and public administration.
How we work
The ECAT is managed in close cooperation with DG CONNECT and the Joint Research Centre of the European Commission. In addition to its own staff, the ECAT draws on the knowledge of fellows and external experts. Furthermore, a team of researchers and practitioners from external institutions will contribute to the work of the Centre. An international network of stakeholders (EU Member States, research community, independent auditors, industry representatives and practitioners, civil society) will guarantee a broad-based, collective approach.
Algorithmic systems are used to identify, categorise, rank, suggest and present information to users. These systems increasingly assist, influence and often replace humans in decision-making processes.
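The identify-categorise-rank pipeline described above can be sketched in a few lines. This is a deliberately crude illustration (the keyword rules, item data and scoring are all invented), not how any platform actually works:

```python
def categorise(item):
    """Assign a crude topic label using invented keyword rules."""
    text = item["text"].lower()
    if "election" in text:
        return "politics"
    if "match" in text or "goal" in text:
        return "sport"
    return "other"

def rank(items, query):
    """Order items by how often the query terms occur in their text."""
    terms = query.lower().split()
    def score(item):
        text = item["text"].lower()
        return sum(text.count(term) for term in terms)
    return sorted(items, key=score, reverse=True)

# Invented example items.
items = [
    {"id": 1, "text": "Election results and election analysis"},
    {"id": 2, "text": "Late goal decides the match"},
    {"id": 3, "text": "Weather forecast"},
]
labelled = [{**item, "topic": categorise(item)} for item in items]
ranked = rank(labelled, "election")
print([item["id"] for item in ranked])  # [1, 2, 3]
```

The point of the sketch is that categorisation and ranking rules, however implemented, embed choices about what users see first, which is precisely what the transparency and audit obligations described below are meant to expose.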
The DSA will require VLOPs and VLOSEs operating in the European Union to identify, analyse and assess certain systemic risks stemming from the design and functioning of their services and related systems, including algorithmic systems. Moreover, they must commit to addressing identified risks, whether directly or indirectly related to the functioning of the algorithmic systems in use.
In addition to the Commission’s supervisory role, such risk assessments and any accompanying mitigation measures will be subject to an external independent audit and oversight by researchers and civil society.