The SCHUFA II case and the widespread use of automated credit rating systems

You spend fifteen minutes completing an online credit application and start planning new purchases with the help of the loan. Ding! An email notification from your phone interrupts your planning only ten seconds after you submitted the application. Unfortunately, the email contains an automatic rejection letter from the bank. Some people in this situation might simply delete the email and switch to another bank. However, a German applicant with a similar experience chose a different way to handle the rejection – she took the credit-rating agency that provided the bank with her credit score to court. On 26 January 2023, the case was heard before the Court of Justice of the European Union (CJEU). As the first CJEU case on whether profiling for credit scoring constitutes a solely automated decision, it will test the threshold of the GDPR’s rules on automated decision-making, and could restrict the widespread use of credit rating systems built on this technology. In this article, we explore how Article 22 of the GDPR[1] could be applied by the CJEU in the context of automated credit rating when the judgment is ultimately handed down.

Case Summary[2]

In the case at hand, the Claimant applied for a loan with a bank in Germany. The bank rejected her application based on her credit rating, which was provided by a German credit-rating company called SCHUFA. The Claimant accepted the bank’s refusal, but submitted requests to SCHUFA for access to the rating methodology and for deletion of her data. SCHUFA declined to provide the Claimant with a detailed breakdown of its scoring methodology, claiming that this information constituted a commercial secret. As a result, the Claimant reported SCHUFA to the Hessian Data Protection Authority (the DPA). Initially, the DPA sided with SCHUFA, holding that there was nothing to suggest that SCHUFA had failed to meet the requirements of Section 31 of the BDSG, the German Federal Data Protection Act, which supplements the GDPR in Germany. Unsatisfied with this decision, the Claimant took SCHUFA and the DPA to the Wiesbaden Administrative Court, which referred the case to the CJEU. The CJEU will now deliberate whether automated credit rating is subject to Article 22 of the GDPR.

Anatomy of Article 22 of the GDPR

Article 22 of the GDPR concerns the rights of data subjects regarding automated decision-making. Article 22(1) imposes a general prohibition on solely automated decision-making with legal significance: an individual has the right not to be subject to “a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.” To fall within the scope of this provision, a decision must (1) be made by automated means, and (2) produce legal or similarly significant effects on the individual.

The first condition concerns the ability to make decisions by technological means without human involvement. To take a decision outside this category, human involvement must be active rather than a token gesture: it should be carried out by someone who is in a position to independently assess the automated output, consider all the relevant factors, and, most importantly, has the authority to overturn the decision. By contrast, if an automated system makes a decision that is automatically delivered to the individual without any prior meaningful assessment by a human, that decision is solely automated. Simply put, the core issue is whether the controller merely applies the decision made by the automated system, or evaluates the decision before it is applied and has the competence to change it.
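As an illustration only, the test just described can be condensed into a short sketch. The field and function names below are invented, and the real legal assessment is of course far more nuanced than a few booleans:

```python
# Illustrative encoding of the Article 22(1) "solely automated" test as
# described above. All names are invented for this sketch; the legal
# assessment is more nuanced than three boolean flags.
from dataclasses import dataclass
from typing import Optional


@dataclass
class HumanReview:
    assesses_output_independently: bool   # not a mere token gesture
    considers_all_relevant_factors: bool  # a genuine re-analysis
    has_authority_to_overturn: bool       # can actually change the outcome


def is_solely_automated(review: Optional[HumanReview]) -> bool:
    """A decision is solely automated unless a human meaningfully
    reviews it before it is applied to the individual."""
    if review is None:  # output delivered straight to the individual
        return True
    return not (review.assesses_output_independently
                and review.considers_all_relevant_factors
                and review.has_authority_to_overturn)


# Example: a clerk who merely rubber-stamps the system's output does not
# take the decision outside Article 22.
rubber_stamp = HumanReview(False, False, False)
assert is_solely_automated(rubber_stamp)
assert is_solely_automated(None)
```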

Secondly, the decision must produce a legal effect on the individual’s legal status or rights, or have a similarly significant impact on the individual’s circumstances, behaviour or choices, such as the automatic refusal of an online credit application[3].

If a decision satisfies these two conditions of “technological independence” and “legal significance”, then it would constitute “automated decision-making” and fall within the scope of Article 22.

Importantly, the general prohibition on automated decision-making is subject to the exceptions listed in Article 22(2): decisions necessary for entering into or performing a contract between the individual and a controller, decisions authorised by Union or Member State law, and decisions based on the individual’s explicit consent. Even where an exception applies, however, safeguards must be put in place to protect the individual under Article 22(3) and 22(4).

Are Credit-Rating Companies Subject to Article 22?

In our case, a credit-rating company profiles the Claimant and assigns her a credit score, and then, based on this score, a bank rejects her loan. By relying so heavily on that credit score to reject the application automatically, the bank undoubtedly makes a solely automated decision with significant effects. But is the credit-rating company, SCHUFA, making one too? In other words, does SCHUFA’s automated credit rating constitute automated decision-making? If it does, could SCHUFA rely on one of the exceptions? And if so, what specific safeguards would it need to put in place? To answer these questions, we need to explore how automated credit rating is carried out and how loan applications are assessed.

Automated credit rating is normally conducted by a credit-rating company via profiling. The company collects the necessary personal data from the online credit application, uses algorithms to analyse these data and identify correlations, and finally applies those correlations to an individual to identify characteristics of present or future behaviour.[4] Simply put, profiling means that the credit-rating company gathers information about loan applicants and evaluates their characteristics or behaviour patterns in order to place them into a certain category or group, in particular to evaluate their financial capacity. This three-stage profiling process applies to this case: SCHUFA scored the Claimant’s creditworthiness by predicting the likelihood that she would repay a loan, sent this credit score to the bank, and the bank used the score to decide whether or not to grant the loan.
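To make these three stages concrete, here is a deliberately toy sketch in Python. SCHUFA’s actual methodology is a trade secret, so everything below – the features, the data and the choice of logistic regression (a classic statistical technique in credit scoring) – is an assumption for illustration only:

```python
# Toy illustration of the three WP29 profiling stages. SCHUFA's real
# model and features are secret; nothing here reflects its methodology.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Stage 1: collect personal data (hypothetical features: income in kEUR,
# number of existing loans, years at current address).
historical_applicants = np.array([
    [45.0, 1, 6],
    [22.0, 4, 1],
    [60.0, 0, 10],
    [30.0, 3, 2],
])
repaid = np.array([1, 0, 1, 0])  # 1 = repaid past loans, 0 = defaulted

# Stage 2: analyse the data to identify correlations between the
# features and past repayment behaviour.
model = LogisticRegression().fit(historical_applicants, repaid)

# Stage 3: apply those correlations to a new individual to predict a
# characteristic of future behaviour, i.e. the likelihood of repayment.
claimant = np.array([[38.0, 2, 3]])
score = model.predict_proba(claimant)[0, 1]  # probability of repayment
print(f"Credit score (repayment probability): {score:.2f}")
```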

A loan application process therefore comprises three stages (wired together in the sketch after this list):

  • the 1st stage, where the applicant completes the application form;
  • the 2nd stage, where the credit-rating company conducts automated credit rating by means of profiling to calculate the credit score; and
  • the 3rd stage, decision-making, where the bank rejects or accepts the loan application.
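The division of labour between the 2nd and 3rd stages can be sketched as two functions, one per controller. The names, the toy scoring formula and the approval threshold are all invented; the point is purely structural – once the bank fixes a threshold, the scorer’s output effectively determines the outcome:

```python
# Hypothetical two-controller pipeline; names, formula and threshold are
# invented for illustration only.

def schufa_credit_score(application: dict) -> float:
    """Stage 2: the credit-rating company reduces the applicant to a
    single repayment-probability score via automated profiling
    (toy formula standing in for the model sketched earlier)."""
    base = min(1.0, application["income_keur"] / 50.0) * 0.8
    bonus = 0.2 if application["existing_loans"] == 0 else 0.0
    return base + bonus


def bank_loan_decision(application: dict, threshold: float = 0.6) -> bool:
    """Stage 3: formally the bank's decision, but once the threshold is
    fixed the outcome is fully determined by the stage-2 score."""
    return schufa_credit_score(application) >= threshold


application = {"income_keur": 30.0, "existing_loans": 2}
print("approved" if bank_loan_decision(application) else "rejected")  # rejected
```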

A restrictive interpretation could be adopted by the CJEU based on the clear distinction the GDPR draws between “processing, including profiling” on the one hand and a “decision based on the processing” on the other. On this reading, SCHUFA’s credit scoring via automated profiling in the 2nd stage could be distinguished from the rejection decision made by the bank in the 3rd stage. As a credit information agency, SCHUFA could argue that determining a score involves “processing”, but not a “decision” within the meaning of Article 22(1). If this argument is adopted by the CJEU, credit-rating companies operating under this model would not be regulated by Article 22, and could continue to say: “We just provide the scores; the banks make the decisions”.

From a factual perspective, however, this credit score played a decisive role in the bank’s evaluation of the loan application. As the Administrative Court of Wiesbaden noted in its order for reference to the CJEU, “it is ultimately the score established by the credit information agency on the basis of automated processing that actually decides whether and how the third-party controller enters into a contract with the data subject; although the third-party controller does not have to make his or her decision dependent solely on the score, he or she usually does so to a significant extent.”[5] This strong causal link between stage 2 and stage 3 could encourage the CJEU to regard the credit-rating company as an automated decision-maker, subjecting it to Article 22. In that case, the company would have to show, first, that its processing falls within one of the legally recognised exceptions to the prohibition on automated decision-making, and second, that appropriate safeguards are in place to protect the individual.

One applicable exception is where the decision is necessary to enter into or perform a contract between the individual and a controller. As the ICO notes, the reference to “a” controller instead of “the” controller implies that the decision-making could be carried out by a controller other than the one who is party to the contract with the individual.[6] Here, even though the contract is between the individual and the bank rather than the credit-rating agency, the agency’s decision could still be covered by Article 22(2)(a) as long as it can be shown to be necessary to fulfil the contract between the applicant and the bank – which would not be an easy task. Alternatively, the credit-rating agency may rely on an exception under national law, such as the German BDSG, where (1) the automated credit rating is for the purpose of deciding on the creation, execution or termination of a contract with the applicant, (2) the data used are demonstrably essential for calculating the score, and (3) the rating is based on a scientifically recognised mathematic-statistical procedure.[7] Under both exceptions, however, credit-rating companies would have to allow individuals to obtain human intervention, express their views on the automated decision, and contest the decision.[8] Additionally, at the time they obtain data from applicants, they would have to provide individuals with “meaningful information about the logic involved” in the automated decision, as well as information about “the significance and the envisaged consequences” of the decision, via a privacy notice.[9]

Conclusion

The final decision will depend on which approach the CJEU adopts to interpret Article 22(1) of the GDPR. If a restrictive interpretation is taken, the automated processing via profiling conducted by credit-rating companies will be distinguished from the bank’s rejection decision, the Claimant will lose the case, and credit-rating companies can continue to decline individuals’ requests for a breakdown of their scoring methodology. Alternatively, given the decisive role the credit score plays in the bank’s decision, the CJEU may decide that automated credit rating itself constitutes automated decision-making. These companies could then no longer refuse to provide information on the grounds that the bank makes the decision and the scoring methodology is a commercial secret. If they failed to comply with these disclosure and transparency obligations, they could no longer lawfully carry on the business of automated credit rating, potentially leading banks to abandon this model in the loan application process. Whichever approach the CJEU adopts, its decision will remove the ambiguity concerning the nature of automated credit rating within the meaning of Article 22, and clarify how the GDPR applies in practice.


[1] GDPR – Article 22 “1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. 2. Paragraph 1 shall not apply if the decision: (a) is necessary for entering into, or performance of, a contract between the data subject and a data controller; (b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or (c) is based on the data subject’s explicit consent. 3. In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision. 4. Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are in place.”

[2] Administrative Court of Wiesbaden (Germany). Case C-634/21 Request for a preliminary ruling. https://curia.europa.eu/juris/documents.jsf?num=C-634/2. Accessed 01 February 2023.

[3] GDPR – 1st Paragraph of Recital 71 “The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention.”

[4] The WP29 Guidelines on Automated Individual Decision-Making and Profiling for the purposes of Regulation 2016/679. https://ec.europa.eu/newsroom/article29/redirection/document/49826. Accessed 01 February 2023.

[5] See above n 2. Administrative Court of Wiesbaden (Germany). Case C-634/21 Request for a preliminary ruling.

[6] Information Commissioner’s Office. Guidance on Automated Decision-making and Profiling. https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/automated-decision-making-and-profiling/when-can-we-carry-out-this-type-of-processing/. Accessed 01 February 2023.

[7] German BDSG – Section 31(1) “For the purpose of deciding on the creation, execution or termination of a contractual relationship with a natural person, the use of a probability value for certain future action by this person (scoring) shall be permitted only if (1) the provisions of data protection law have been followed; (2) the data used to calculate the probability value are demonstrably essential for calculating the probability of the action on the basis of a scientifically recognized mathematic-statistical procedure; (3) other data in addition to address data are used to calculate the probability value; and (4) if address data are used, the data subject was notified ahead of time of the planned use of these data; this notification shall be documented.”

[8] GDPR – 4th Paragraph of Recital 71 “In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision.”

[9] GDPR – Article 13(2)(f) “the controller shall, at the time when personal data are obtained, provide the data subject with the following further information necessary to ensure fair and transparent processing: the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.”
