Case C-634/21 – Request for a preliminary ruling from the Verwaltungsgericht Wiesbaden
Parties: OQ v Land Hessen
Joined party: SCHUFA
Advocate General: Priit Pikamäe
Background to the case
If you live in Germany, you are surely familiar with the Schufa credit score. Schufa is a private German agency that calculates the creditworthiness of individuals applying for loans, mortgages and even house rentals. In a nutshell, Schufa’s automated system calculates the trustworthiness of a person based on profiling. A certificate is then issued with a score (see an example here). Positive or negative, this score is the only information that a person receives. How the score is calculated is not disclosed.
In 2018, the applicant OQ was refused credit by a third party after Schufa provided a negative credit score. OQ then asked Schufa to provide further information on the personal data it stored and to erase incorrect entries. Schufa replied that the score was 85.96% and gave the data subject a broad outline of the basic functioning of its system, but not the calculation methods, which it considered covered by commercial secrecy. The data subject then filed a complaint with the German Data Protection Authority (DPA), requesting that it order Schufa to provide information about the logic involved, as well as the significance and consequences of the processing. Two years later, the DPA decided not to take action against Schufa. The data subject then appealed the decision before the Administrative Court of Wiesbaden.
In October 2021, the Court of Wiesbaden decided to stay the proceedings and refer two questions on the interpretation of Article 22 of the GDPR to the Court of Justice of the European Union (CJEU). Finally, the right not to be subject to automated decisions comes before the CJEU for the first time (see the extensive comment on the hearing by Häuselmann 2023).
Is credit scoring an automated decision?
Under Article 22 of the GDPR, data subjects have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them. When automated decisions are exceptionally allowed under the conditions set out in the second paragraph, the data controller shall implement suitable safeguards for the data subject, such as the right to obtain human intervention, to express their point of view and to contest the decision (Article 22(3) GDPR).
In the case at hand, the credit score was calculated by means of profiling to evaluate and predict the economic situation and reliability of the data subject (Article 4(4) GDPR). The credit score was then transmitted to another party, i.e. a credit institution, which decided to refuse credit to the applicant. Even though the certificate issued by Schufa indicated only a score without providing any recommendation, the credit institution took that score into account for its decision.
“Multi-stage profiling” is a term proposed by Binns and Veale which aptly describes cases where the decision-making process is segmented into different stages (Binns and Veale 2021). In this case, Schufa establishes the score based on profiling and, subsequently, the bank takes its decision based on that score. The legal question is seemingly simple, yet complex: is credit scoring, based on profiling, an automated decision within the meaning of Article 22 of the GDPR? And if not, what legal protection is afforded in these cases?
The opinion of the referring Court
According to the Court of Wiesbaden, credit scoring is an automated decision. In the request for a preliminary ruling (hereafter ‘request’), the court essentially argues that credit scoring is not just profiling but an independent automated decision, and that the fact that a third party takes the final decision does not exclude the applicability of Article 22 of the GDPR (§§21-23 request). The court’s argument rests on a factual point of view, in light of the decisive influence of the score on the final decision, and on a legal point of view, taking into account the purposes of Article 22 of the GDPR.
From a factual point of view, the court considers the significant and decisive influence of the credit score on the final decision whether to enter into a contract with the data subject. A poor score will lead to the refusal of a loan “in almost every case” (§25 request). As reported by Häuselmann, during the hearing before the CJEU the AG asked Schufa what consequences negative scores have for data subjects. While the company did not have specific numbers, it mentioned that “roughly 20% of individuals who receive a negative score will still receive loans”. Clearly, this also means that in 80% of cases, people with a negative score will not get a loan (Häuselmann 2023). The referring court therefore argues that it does not matter who actually takes the decision, whether Schufa or the bank, nor that humans are involved, because “it is ultimately the score that actually decides” (§25 request).
From a legal point of view, a restrictive interpretation of Article 22 of the GDPR would not align with the purposes it aims to achieve and would create a lacuna in legal protection. Moreover, it would raise problems for the effective enforcement of data subjects’ rights, particularly the rights of access and transparency (§§26-31 request). While it is true that data subjects have general rights to information and access (Articles 13, 14 and 15 GDPR), the specific right to “meaningful information about the logic involved, significance and envisaged consequences” applies only to automated decisions under Article 22(1) of the GDPR. In the case at hand, Schufa is not obliged to provide such information and indeed refrains from doing so, invoking commercial secrecy. The credit institution is likewise unable to provide it because it does not know how the score is calculated. This gives rise to a lacuna in legal protection, which can, however, be filled if the establishment of the score falls within the scope of Article 22 of the GDPR (§§29-32 request).
The opinion of AG Pikamäe
In his opinion, delivered on 16 March 2023, the AG suggests interpreting Article 22 of the GDPR as meaning that:
The automated calculation of a probability rate relating to the ability of a data subject to repay a debt in the future already constitutes a decision based solely on automated processing, including profiling, which produces legal effects concerning the data subject or significantly affects him in a similar way, where that rate, calculated on the basis of personal data relating to the data subject, is transmitted by the controller to a third party controller and, in accordance with established practice, the latter predominantly bases its decision on the conclusion, implementation or termination of a contract with the data subject on that rate.
First, the AG argues that the refusal of credit has legal effects, since the individual can no longer benefit from a contractual relationship with the financial institution, and significant effects insofar as such refusal affects the financial situation of the person concerned (§35 Opinion). The GDPR also explicitly mentions the refusal of an online credit application as a typical example of a decision under Article 22 (Recital 71 GDPR). The core and novel argument of the AG is the interpretation of the concept of decision, which includes acts that have not only legal consequences but also economic and social impact (§38 Opinion).
Second, the credit scoring itself can be a decision based solely on automated processing, including profiling, when: 1) the scoring predetermines the financial institution’s decision to grant or deny credit; and 2) there is no margin for human discretion to verify the result and the correctness of the decision to be taken vis-à-vis the person applying for credit (§§42 and 44 Opinion). According to the AG, the decisive factor is the impact that the decision has on the data subject. Given that a negative score alone may have a negative impact on the data subject, i.e. significantly restrict them in the exercise of their freedoms or stigmatize them in society, it seems justified to qualify it as a decision when a financial institution attaches fundamental importance to it in the decision-making process (§43 Opinion). In such circumstances, the activity of the credit agency affects the person applying for credit already at the stage of the assessment of their creditworthiness, and not only at the final stage of the refusal of credit, where the financial institution merely applies that assessment to the specific case.
Furthermore, the concept of “solely automated” requires that the automated processing is the only element justifying the financial institution’s approach towards the credit applicant. The presence of a human does not exclude the applicability of Article 22 of the GDPR as long as the internal rules and practices of the financial institution leave no “margin of discretion” in the application of the score to a credit application (§44). While the AG suggests that this question can best be assessed by the national courts (§§40 and 45), the opinion also proposes a solution for the present case.
In light of the information contained in the order for reference, the AG suggests that the scoring by Schufa is a decision within the meaning of Article 22 of the GDPR because it “tends to predetermine the latter’s decision as to whether to grant or refuse credit to the person concerned, so that it must be considered that that position is of a purely formal nature in the context of the process” (§47 Opinion). The AG justifies this conclusion also in light of the objectives pursued by the EU legislator, namely the protection of data subjects’ rights. As suggested by the referring court, a restrictive interpretation of Article 22 of the GDPR would create a gap in legal protection, in which data subjects could not exercise their rights under Articles 15(1)(h), 16 and 17 of the GDPR (§§48-50 Opinion). In other words, “the data subject should not have to suffer the adverse consequences of such a delegation of activities” by the financial institution to Schufa (§50 Opinion). Finally, the AG refers to Articles 7 and 8 of the Charter and suggests holding the credit rating agency responsible for the calculation of the score, since this activity “is ultimately the source of any possible harm” (§51 Opinion).
Finally, in reply to the second question from the referring court, the AG clarifies that Article 6(1) and Article 22 of the GDPR do not preclude national legislation on profiling when the profiling falls outside the scope of Article 22(1) of the GDPR. In that case, however, the national legislation must comply with the conditions laid down in Article 6 of the GDPR. In particular, it must rest on an appropriate legal basis, which is for the national court to verify.
Further answers to doctrinal dilemmas
The Opinion of the AG also sheds light on doctrinal questions surrounding the legal protection for automated decision-making in the GDPR. First and foremost, the AG clarifies that the “right not to be subject to decisions based solely on automated processing or profiling” must be interpreted as a prohibition, and not as a right that the data subject must invoke (§31 Opinion).
Moreover, the AG elucidates the content of Article 15(1)(h) of the GDPR on the obligation to provide “information about the logic involved and envisaged consequences”. According to the AG, this information includes disclosing the calculation method for the credit score, unless there are conflicting interests worthy of protection, such as industrial and business secrecy. While the AG recognizes that the protection of industrial and business secrets or intellectual property cannot justify a refusal of information tout court, such protection deserves consideration through a proper balancing of interests. In light of a joint reading of Article 12(1) GDPR and Recitals 58 and 63, the AG concludes that the obligation to provide “meaningful information on the logic used” must be interpreted as requiring sufficiently detailed explanations of the method used to calculate the score and of the reasons that led to a particular result (§58 Opinion). By contrast, the principles of transparent information and communication in Article 12 of the GDPR exclude the disclosure of the algorithm itself, given its complexity.
Real impact, decisive role and human discretion: open questions for the CJEU
The case at hand underscores two crucial problems regarding the applicability of Article 22 of the GDPR in practice. First, decision-making may involve different actors that all contribute to a significant extent to the final decision: in this case, Schufa, with its credit scoring, and the bank, taking the final decision. Second, the crucial weight of a credit score and the opacity surrounding it, as Schufa does not explain why and how the score is calculated. The data subject receives a negative score without knowing why.
The case, therefore, arises from a legitimate and timely concern of the referring court, i.e. the lack of legal protection for data subjects and of transparency in credit scoring. The AG suggests bringing credit scoring within the scope of Article 22 of the GDPR to fill the gap in legal protection for data subjects. If the CJEU follows this line, the judgment will have huge repercussions for Schufa and other credit rating agencies. However, despite its noble aims, the opinion presents some weaknesses which, hopefully, the CJEU will address in its judgment.
The first issue relates to the argument that the final decision is “predominantly based” on the credit score. The referring court and the AG both focus on the concrete impact of the credit score in the specific case and its “decisive role” in the final decision. The decisive factor is, therefore, that the scoring “predetermines” the final decision, since the financial institution “predominantly bases its decision” on it (§95 Conclusion of the Opinion). Moreover, the absence of human intervention depends on the margin of discretion left by the internal rules and practices of the financial institution. As the AG suggests, the national court must assess these conditions (decisiveness and discretion) in the specific case.
The main limitation of this argument lies in the “decisive role” of profiling, which requires an assessment in concreto. Decisiveness can hardly be evaluated ex ante, before a decision is taken, as it can only be inferred on a case-by-case basis. The ex post dimension of the concept of “decisive role” may raise issues of legal certainty for a general prohibition to apply. Moreover, an ex post assessment also affects the applicability of the obligations to provide information to data subjects on automated decision-making, which, according to Articles 13(2)(f) and 14(2)(g) of the GDPR, must be provided when the personal data are collected, i.e. before the final decision is taken.
Secondly, what standards should be used to assess decisiveness and discretion? If the human decision-maker considers other factors, would that be enough to exclude the decisive role of the score in the final decision? Here, the CJEU must provide clear and precise guidelines to ensure legal certainty and the effective application of the prohibition in Article 22(1) of the GDPR.
A second avenue the CJEU could consider is to expand legal protection through an extensive interpretation of Articles 13, 14 and 15 of the GDPR. Possible leeway can be found in their wording: “the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject” (emphasis added). The words “at least in those cases” (referring to solely automated decisions) could be interpreted by the Court as extending this transparency obligation also to cases where a decision is based on credit scoring, even though it does not formally qualify under Article 22(1) of the GDPR. Even if not formally a decision, credit scoring has an inherently negative impact on the data subject and stigmatizing effects, as rightfully acknowledged in the opinion (§43 Opinion). Increasing transparency when the credit score is calculated would empower data subjects when it matters the most: under Articles 13 and 14 of the GDPR, when data are collected and before a decision is taken; under Article 15 of the GDPR, also after the decision is taken, so that the data subject is able to effectively contest the scoring which formed the basis of the decision.
In conclusion, while the opinion of the AG aims to expand legal protection for data subjects, the concepts of “decisive role”, “predominantly based” and “discretion” require detailed interpretative guidance from the CJEU in order to be effectively applied. To avoid the pitfalls of this reading, the Court could consider expanding legal protection through an extensive reading of the transparency obligations. At DigiCon, we surely cannot wait to read this landmark judgment, which will undoubtedly have profound consequences for the Black Box society we live in.
Co-founder and chief editor of DigiCon
Post-doc researcher at Hertie School working on AI and human rights