More Rights does not Necessarily Mean Better Legal Protection: A Critical Look at Remedies in European Digital Policy



In the digital era, where technologies shape every aspect of our lives, from how we communicate to how we make decisions, the concept of access to justice through effective remedies within the emerging legal framework demands critical scrutiny. In our forthcoming publication “Constitutional Right to an Effective Remedy in the Digital Age: A Perspective from Europe” (available here and here), we undertake a comparative analysis of the emerging remedial designs in European digital regulation. Taking the right to an effective remedy as our lens, we assess the remedial designs under three key European digital regulations: the General Data Protection Regulation (GDPR), the Digital Services Act (DSA) and the Artificial Intelligence Act (AI Act). Our contribution aims to start a conversation on how institutional collaboration ought to function in order to ensure respect for the constitutional right to an effective remedy. Our findings open a crucial debate about the right to an effective remedy in Europe: despite the stated aim of the EU’s digital policy to protect citizens in the digital sphere, we find that the European approach is diluted by fragmentation and ambiguity in the new remedial procedures that the regulations establish. This fragmentation produces a complex justice landscape that can deter individuals from seeking redress, underscoring the need for a more harmonised approach.

Bringing Remedies into the Constitutional Spotlight

At the heart of the European Union’s legal framework lies the Latin principle ubi ius ibi remedium, meaning that every individual should have the right to an effective remedy for any violation of their rights and freedoms. This fundamental right, guaranteed by Article 47 of the Charter of Fundamental Rights of the European Union (CFR), should also apply in the digital age, thus guiding the protection of fundamental rights. Digital relationships, characterised by their complexity and transnational nature, trigger the need for a re-evaluation of what an effective remedy means in a European digital policy that pursues novel avenues of access to justice for individuals whose rights are affected by the uses of technologies such as artificial intelligence.

The European Union has been at the forefront of adapting its legislative framework to the digital age, with landmark regulations such as the GDPR, the DSA and the AI Act. Our analysis reveals a fragmented legislative landscape, leading to a complex patchwork of remedies for those seeking justice. Specifically, we assess the emerging remedies across three main types: internal complaint mechanisms, independent supervision, and judicial remedies. Each of these types of remedies plays a different role in the digital ecosystem, yet they all face a common challenge of fragmentation that will require effective institutional collaboration among the different (supervisory) stakeholders.

Internal complaint mechanisms provide a direct and seemingly more accessible avenue for individuals to address their grievances in the digital age. The main aim of these mechanisms is to empower users to raise concerns directly within the platform or service, without having to engage with external regulatory or judicial procedures that may be lengthier and costlier. In the realm of data protection, the GDPR reinforces data subjects’ rights and access to remedies, primarily through mechanisms like the rights to access and erasure, although it lacks direct internal complaint mechanisms. The DSA, in turn, expands remedies in content moderation by introducing internal complaint-handling systems aimed at protecting users’ fundamental rights (see e.g. here). Online platforms must handle complaints diligently and non-arbitrarily, with limitations on automated decision-making. Meanwhile, the Artificial Intelligence (AI) Act focuses on internal accountability, requiring clear explanations for AI-driven decisions affecting individuals’ rights (see e.g. here), yet the implementation of this right remains ambiguous, especially concerning its interaction with GDPR provisions (see here).

Independent supervision involves oversight by independent supervisory bodies. Required to be independent and impartial, much like judicial bodies, and equipped with technical and subject-specific expertise, independent supervisory authorities play a key role in enforcing compliance and protecting fundamental rights in the digital age. Administrative remedies, exemplified by the GDPR’s right to lodge complaints with data protection authorities, offer crucial avenues for individuals to address violations of their rights, albeit with limitations, such as the varying capacities of these authorities across Member States (see here). The DSA further extends remedies by allowing users to lodge complaints with Digital Services Coordinators and to access out-of-court dispute resolution mechanisms, although challenges arise from the non-binding nature of the resulting decisions and online platforms’ discretion in engaging with certified bodies. The AI Act, for its part, reaffirms existing administrative and judicial remedies for AI-related grievances and introduces administrative complaint mechanisms through national market surveillance authorities, raising concerns about potential fragmentation and confusion in oversight responsibilities, particularly regarding data protection.

Judicial remedies represent the cornerstone of legal recourse to justice, offering a pathway to seek redress through the courts. Lying at the essence of the right to an effective remedy protected under Article 47 of the Charter of Fundamental Rights of the EU, judicial remedies enable courts to deliver legally binding decisions, providing a formal avenue for compensation or other forms of legal redress to individuals whose rights are affected by digital technologies. The GDPR provides data subjects with a two-fold avenue for seeking judicial redress, emphasising a rights-based approach to ensure a high level of protection for the fundamental rights to privacy and data protection (see here). Additionally, the GDPR grants collective rights of access to court, enabling not-for-profit bodies to bring complaints on behalf of data subjects, a significant addition to the remedial architecture that is particularly relevant in the digital age. Similarly, the DSA introduces rights for users to access judicial remedies. Conversely, the final agreement on the Artificial Intelligence (AI) Act does not explicitly enshrine a right to seek judicial remedies against the uses of AI systems, leaving individuals to rely on existing Union law for such recourse. However, broader effects of AI systems on health, safety, or other interests may necessitate alternative mechanisms for seeking redress, likely involving instruments such as the Product Liability Directive and the proposed AI Liability Directive.

Fostering the Right to an Effective Remedy in the Digital Age

Adding more rights and remedies to the legislative set-up does not fix the problem. On the contrary, this set-up appears to lead to unprecedented fragmentation in access to justice, which risks eroding the essence of the right to an effective remedy. The intertwining of various legal instruments creates a labyrinth that challenges individuals’ ability to seek justice. In our contribution, we therefore place this set-up under scrutiny in pursuit of a coherent approach that streamlines digital remedies and thus enhances their accessibility and effectiveness.

Traditional ex-post remedies reach their limits in the context of the digital age. Recent CJEU rulings underscore the importance of transparency in automated systems, emphasising the necessity of individuals having access to information regarding the criteria used in assessments. The Court also highlights the significance of prior review of these criteria by independent supervisory authorities before their implementation (see also here). However, limitations persist, including territorial jurisdiction constraints, the limited scope of judicial review of technical standards, and practical obstacles such as the time and cost of pursuing remedies. These challenges highlight the complexities of navigating the legal landscape in safeguarding individuals’ rights effectively.

Clarity in the interplay of transparency requirements across emerging digital legal frameworks, such as the DSA, the AI Act and the GDPR, is crucial for effective access to remedies. For instance, debates surrounding the right to explanation under the GDPR and the AI Act raise questions about procedural and substantive clarity. These disparities may lead to inconsistent application of the rules, jeopardising the legal protection afforded to individuals. As the digital landscape evolves, efforts to foster institutional collaboration and streamline enforcement mechanisms become imperative to address these challenges effectively.

Fostering institutional collaboration among supervisory authorities at both national and EU levels is essential for ensuring effective access to remedies while preserving legal certainty. However, this requires navigating complexities such as jurisdictional overlaps and varying enforcement capacities among Member States. While centralisation may offer benefits in harmonising enforcement, it must be approached cautiously to avoid undermining national identities and the principle of subsidiarity. Ultimately, striking a balance between European coordination and respecting national nuances is crucial for advancing digital governance and safeguarding individuals’ rights in the algorithmic society.

Feasible supervision of European digital policy requires a multifaceted approach aimed at enhancing clarity and institutional collaboration in delivering remedies in the digital context. A clear, coordinated procedural approach is crucial not only for strengthening fundamental rights protection but also for empowering individuals to navigate the remedial landscape with confidence. Our findings advocate for ongoing regulatory evaluation and adaptation to keep pace with the digital evolution.


PLMJ Chair at Católica Global School of Law.

Simona Demková
Assistant Professor of European Law at Leiden University

Simona Demková is an Assistant Professor of European Law at Leiden University (the Netherlands). Her research focuses on European public law, fundamental rights protection, and law and technology. Previously, she worked as a postdoctoral researcher at the University of Luxembourg within the interdisciplinary (law and computer science) project DILLAN (Digitalisation, Law and Innovation). She completed her PhD thesis, titled ‘Effective Review in the Age of Information: The Case-study of Semi-automated Decision-making based on Schengen Information System’, at the University of Luxembourg under the supervision of Prof. Herwig Hofmann. You can find her publications here: https://www.universiteitleiden.nl/en/staffmembers/simona-demkova/publications#tab-3
