Disinformation: Multistakeholder transparency and the rule of law 


Disinformation, fundamental rights and regulatory approaches

Disinformation, as it is often said, is not a new phenomenon. What has changed in recent years is the speed, the reach and the means of its dissemination. As a consequence, it can reach large numbers of citizens very quickly.

Tackling disinformation is not straightforward, and it is increasingly clear that there is no single silver-bullet solution. Disinformation as such is not illegal, and any response needs to respect fundamental rights.

As in other sectors, the move of issues into the digital sphere entails a new set of challenges, including in the policy and regulatory sphere. Disinformation is no exception. In recent years, the response to disinformation during elections in various democratic countries and during international crises such as the Covid-19 pandemic and the war in Ukraine has posed, among other things, constitutional challenges. Different laws and regulations were adopted in various national contexts, including by some EU member states, adding complexity to a phenomenon which is global but which at the same time retains very local characteristics (e.g. language, media diet, a country's historical background, education, broadband penetration, etc.).

In this context, the European Union seems to be breaking new ground with an approach which started with self-regulation (as the natural first step) and which is moving towards co-regulation. The first Code of Practice on Disinformation was adopted by its signatories (mainly online platforms) in 2018. It was the first of its kind, and other countries (e.g. Australia) have followed a similar path. An assessment carried out one year later found that the Code had proven to be a strong tool for cooperation and dialogue and had nudged the signatories towards the adoption of new policies on disinformation. However, it also showed that there was still room for improvement, for instance on commonly shared definitions, data granularity, transparent KPIs, access to data and monitoring. The inclusion of additional signatories, including from the advertising sector, was also encouraged.

This led to the signature, in June 2022, of the Strengthened Code of Practice, whose main points are: demonetisation; transparency of political advertising; ensuring the integrity of services; empowering users; empowering researchers; empowering the fact-checking community; a Transparency Centre and Task-force; and a strengthened monitoring framework. Signatories of this new Code also include advertisers, civil society organisations and fact-checkers. In parallel, the Regulation for a Digital Services Act was adopted. With the latter there is a move from self-regulation to co-regulation, with the Code of Practice potentially serving as a Code of Conduct under Article 35.

When dealing with the online sphere, the regulatory challenge is to keep pace with technologies which evolve very quickly, and certainly faster than the legislative machinery. In domains with a high rate of technological development, such as digital media, it is increasingly difficult to create top-down public regulation that is sufficiently effective and future-proof. At the same time, regulation needs to ensure effectiveness and flexibility in order to secure long-term benefits, as well as trust and credibility with citizens and stakeholders, while preventing capture (de Cock Buning and Senden 2020). Any regulatory intervention needs to guarantee full respect of fundamental rights and of media diversity and pluralism.

To address the challenge of building a future-proof, evidence-based policy approach, dialogue and coordination among the actors involved are fundamental. The Code of Practice, the DSA, the proposal for a European Media Freedom Act, as well as initiatives such as the European Digital Media Observatory (EDMO) and the strengthening of the role of ERGA, all go in that direction. However, we suggest investing more in the dialogue with engineers and developers, which would ease the creation of products that respect fundamental rights, the rule of law and democratic principles by design. In particular, engineers and developers would benefit from policy knowledge, as this would help limit ex ante the risk that their services are exploited to cause harm to democracy. On the other hand, through a structured dialogue with engineers, policy makers would gain a time advantage in identifying the regulatory challenges that new services and products may trigger.

Transparency and accountability in the multi-stakeholder approach

The growing recognition of the importance of a multi-stakeholder approach to tackling disinformation is evident in this year's Strengthened Code of Practice. The Strengthened Code goes a long way towards addressing the criticisms levelled against the previous one, by providing indicators to assess progress and compliance, and through the final but perhaps most important commitment by VLOPs – seeking alignment with the DSA – to be audited at their own expense by independent auditors (see Commitment 44). The Strengthened Code's commitment to a multi-stakeholder approach is evident not just in the diverse composition of the signatories, but in the extensive inclusion of diverse stakeholders as platform allies in the implementation of the Code itself. While consumers and the research community already featured in the first Code as stakeholders to be ‘empowered’, such commitments remained vague and did not fully reflect the complexity of responding to disinformation online and the actors involved in it. The new Code's extensive commitments to empowering users, the research community and the fact-checking community go a long way towards spelling out the next steps in a truly multi-dimensional approach to disinformation. At the same time, with the inclusion of a growing number of stakeholders as partners in the fight against disinformation comes an increased need for transparency by all actors involved, to avoid diluting accountability through collaborative efforts.

The need for increased transparency is clearly recognised in the establishment of a Transparency Centre envisioned under the Code. Not only will signatories have to report on various aspects of their policies, including publishing ‘the main parameters of their recommender systems’; they also commit to making public in the Transparency Centre the terms of service and policies that their service applies to implement each Commitment and Measure they subscribe to. At the same time, the inclusion of a wide range of stakeholders in the commitments of the Code brings the need for a fresh look at transparency in a multistakeholder approach to disinformation.

In recognition of their key role in the fight against disinformation, the new Code dedicates a whole chapter to empowering the fact-checking community. Relevant signatories have committed to, inter alia, facilitating, across all Member State languages in which their services are provided, user access to tools for assessing the factual accuracy of sources through fact-checks from fact-checking organisations that have flagged potential disinformation, as well as warning labels from other authoritative sources. They have also committed to establishing a framework for transparent, structured, open, financially sustainable and non-discriminatory cooperation between them and the EU fact-checking community regarding the resources and support made available to fact-checkers. Similarly, signatories have committed to making it possible for users of their services to access indicators of trustworthiness (such as trust marks focused on the integrity of the source, and the methodology behind such indicators) developed by independent third parties, in collaboration with the news media (including associations of journalists and media freedom organisations), as well as fact-checkers and other relevant entities, that can support users in making informed choices.
The research community also features more prominently in the new Code as a key player in ensuring public accountability and understanding of disinformation and the policies taken to counter it. 

Implementation of the Code of Practice on Disinformation and the rule of law

With the Strengthened Code's commitments spelled out in much greater detail in the 48-page document, there is no doubt that the question of what implementation will look like in practice looms large, and the devil may well be in the detail. As different stakeholders are drawn into the details of the implementation of the current regulatory efforts to tackle disinformation, increased transparency from all players involved remains essential to assess not only the effectiveness of the Code but also its validity from a constitutional perspective: whether it lives up to the principles of the rule of law, taking account of its potential impact on fundamental rights. As Suzor noted in 2018: “The values of the rule of law—values of good governance—provide a way to conceptualize governance by platforms in constitutional terms. At a minimum, for a system of governance to be legitimate, decisions must be made according to a set of clear and well-understood rules, in ways that are equal and consistent, with fair opportunities for due process and independent review.” If this particular form of self/co-regulation becomes too opaque to monitor, this will in practice pose a challenge to the rule of law and to the EU's own particular approach to digital constitutionalism on this side of the Atlantic (De Gregorio 2022). Similarly, definitional clarity by all actors involved constitutes a key first step in this direction, with key terminology still to be defined in the context of a common understanding of manipulative behaviour, indicators of trustworthiness and the labels to be applied by fact-checkers. The Transparency Centre and the Task-force will no doubt be key instruments to provide legitimacy in the eyes of the public and to review, adapt and improve the reporting and monitoring framework in order to meet the EU's own high rule of law standards.

Finally, online platforms moderate content on the basis of their terms of service, to which users subscribe when creating their accounts; as such, transparency and clear definitions are fundamental. This creates a second legal layer (in addition to the regulatory one) for citizens and might create a constitutional challenge if the underlying fundamental values and principles are not the same.

Paula Gori
Secretary-General at EDMO

Paula Gori is the Secretary-General and Coordinator of the European Digital Media Observatory (EDMO). She joined the School of Transnational Governance at the European University Institute in 2017, where she is a member of the management team. Before that, she was the Coordinator of the Florence School of Regulation – Communications and Media, which offers training, policy and research activities on electronic communications regulation and competition, and she collaborated with the Centre for Media Pluralism and Media Freedom, which she coordinated during its initial set-up phase back in 2012. For several years she was the Scientific Coordinator of the Annual Conference on Postal and Delivery Economics, and she is one of the authors of the report for the European Commission on European Union competences in respect of media pluralism and media freedom. Paula has a legal background, is a qualified civil mediator and serves as an evaluator of EU-funded projects.

Lisa Ginsborg
Research Fellow at School of Transnational Governance

Lisa Ginsborg holds a Ph.D. in Public International Law from the European University Institute (EUI) in Florence, Italy. She has previously worked as a Post-Doctoral Researcher at University College Dublin, as a Teaching Fellow at the European Inter-University Centre for Human Rights and Democratisation (EIUC) in Venice, and was a Visiting Researcher at New York University (NYU) School of Law, and at the Sydney Centre for International Law at the University of Sydney.

