Enhancing Legitimacy and Tackling Data Power Through Qualified Transparency in the GDPR


            The GDPR has not solved the issue of the legitimacy of personal data processing. Enacting the GDPR was a measure to legitimize data processing and to find a middle ground between rights protection and the free flow of personal data. However, fundamental data protection issues remain unresolved, such as structural privacy erosion and the risks to the liberal constitutional order posed by data power and by a political economy based on personal data processing. The question of the legitimacy of personal data processing therefore needs to be reevaluated. The article discusses the GDPR's shortcomings in tackling data power and legitimizing personal data processing. It then proposes qualified transparency (an obligation to publish data protection impact assessments by default upon crossing a data power threshold) as a basis for amending the GDPR, for interpreting it dynamically, or for self-regulation by entities wielding data power.

Data power and legitimacy in the GDPR

            The GDPR fails to address the issue of data power and the related question of the legitimacy of personal data processing. A few years after the GDPR came into force, the main risks related to data power, such as the long-term erosion of personal autonomy, remain unsolved. Data power is an accumulation of personal and contextual data combined with the technological capacity to influence behavior without regard to a person's will[1]. The threat lies, for example, in nudging and manipulating people into choices that optimize the objectives of parties wielding data power. Data power poses a structural challenge to fundamental rights, especially privacy and autonomy, and to the liberal constitutional order. By breaching the aforementioned rights, data power erodes the liberal, democratic order, which depends on autonomous and informed citizens able to choose and participate in their political community.

Structural loneliness of a data subject

            Although, in theory, the GDPR aims to empower people by giving them control over their data, in practice the regulation does not deliver sufficient results. One reason is that the GDPR leaves the data subject alone, without the necessary information, vis-à-vis a controller wielding data power. In effect, data processing is a private law relationship. To be legitimate and to keep the balance between the rights and interests of stakeholders, personal data processing should instead be more public and transparent.

            A data subject is endowed with actionable rights against controllers wielding data power, such as the rights to information, to object, and to erasure (Chapter III of the GDPR, Rights of the data subject). However, the risk-based logic of controller self-assessment and the lack of meaningful transparency preclude individual and social control over how personal data is processed. The information provided under the right to information (Art. 13 and 14 GDPR) is not enough to review the legitimacy of the purposes and interests pursued by a controller. Nor is it transparent what risks a controller takes on by processing data. Limited transparency means that, at the outset, whether the rights and interests of the data subject and the controller are balanced is a matter of the controller's subjective opinion. Consequently, the data subject has too little information to meaningfully evaluate the risks to his or her privacy and data.

More transparency and publicity as an empowerment measure

            That is why more transparency and information about personal data processing are needed to address the issue of the legitimacy of data processing and to restore the balance between data subjects, society, and controllers wielding data power. The article argues for the introduction of qualified transparency to achieve this goal. Qualified transparency would oblige controllers with data power to conduct and disclose their data protection impact assessment (DPIA) (Art. 35 GDPR) by default. In addition, the DPIA should be accompanied by arguments justifying processing for particular purposes and interests in a given context. The obligation would apply upon crossing a predefined threshold of data power, dependent on the quantity and types of data processed.

            Legitimacy, and the balance between rights and interests, would be improved by default access to meaningful information about personal data processing and to the arguments a controller puts forward to justify it. Default access would enable many more stakeholders to act on, evaluate, and challenge how a controller assesses risk and justifies the processing, empowering the individual data subject through collective means of action. The public should be able to act jointly on behalf of data subjects: anyone who can show a reasonable probability of being impacted by a company wielding data power should be able to complain, or bring a class action, before the data protection authority. The group privacy perspective, discussed further in the article, provides a theoretical justification for including the broader public in the process.

Conclusion and the road ahead

            Qualified transparency would also provide more input for privacy and data protection discourse. More disclosure would, for example, yield more information with which to discuss and judge whether particular purposes and interests for processing in a given context are legitimate and balanced. In this way, qualified transparency would frame personal data processing as a public issue, empowering people by lessening the administrative burden of claiming their rights and by allowing various social actors to act on their behalf. Further, greater transparency regarding personal data processing would give the relevant authorities and courts more information on which to supervise and decide, benefiting data protection discourse as a whole.

Jan Czarnocki
Doctoral Researcher and Maria Skłodowska-Curie Fellow at KU Leuven Centre for IT & IP Law

Jan Czarnocki is a Doctoral Researcher and Maria Skłodowska-Curie Fellow at KU Leuven Centre for IT & IP Law. He currently researches privacy and data protection, focusing on biometric and health data in the IoT-AI context within the Privacy Matters (PriMa) ITN project. His research encompasses the intersections of law, philosophy, technology, and policy. Jan is a Non-Resident Fellow at the Stanford Law School Transatlantic Technology Law Forum and an Affiliated Fellow at the Jagiellonian University Private Law of Data Project, and has been a Visiting Researcher at the Julius Maximilian University of Würzburg Human-Computer Interaction Group. Before that, he was a trainee in the External Policies Directorate of the European People’s Party Group in the European Parliament and a “European View” editor-intern at the Wilfried Martens Centre for European Studies. He holds a Master’s degree in law from the University of Warsaw and an LL.M. degree in Comparative Law from the China University of Political Science and Law in Beijing.
