Last year, in anticipation of the CJEU’s judgment in the SCHUFA II case, we examined the key issue in automated credit scoring. SCHUFA, a German Credit Reference Agency (CRA), provided a loan applicant’s credit score, which led the bank to reject the application. The applicant sought access to the detailed scoring methodology, which SCHUFA refused to provide, citing trade secret protection. SCHUFA also argued that the decision to reject the loan rested with the bank. The main legal dispute was whether the CRA’s automated credit scoring constitutes a “decision” under Article 22 of the GDPR[1]. If the answer was yes, the loan applicant would be entitled not to be subject to the automated credit scoring, and a CRA like SCHUFA would have to establish a legal basis to lift the restriction.
In our previous article[2], we outlined two possible interpretations of Article 22:
a) Restrictive Approach: We argued that the credit score calculated by SCHUFA could be distinguished from the bank’s ultimate decision to reject a loan application. In this view, calculating a score only constitutes “data processing” rather than “decision-making”. Consequently, the activities conducted by CRAs like SCHUFA would not fall under the scope of Article 22.
b) Broad Approach: Conversely, we acknowledged that the credit score could potentially produce significant legal effects on individuals, such as the automatic refusal of loans, insurance, rent, or electricity supply contracts. In this view, automated credit scoring can already constitute a decision which is prohibited under Article 22, unless CRAs could find a legal basis and comply with relevant information obligations.
A few months after our initial analysis, the CJEU released its judgment[3] in SCHUFA II, clarifying the restrictions on automated processing under Article 22. In this article, we first review the court’s decision and then summarise some data protection considerations for the stakeholders involved in credit scoring activities.
CJEU Judgment Analysis
The court addressed (1) whether the establishment of a credit score by CRAs constitutes an automated decision, and (2) whether a CRA could refuse to disclose its detailed scoring methodology based on commercial secrecy.
1. Whether the establishment of a credit score by CRAs is an automated decision
The court definitively answered yes, irrespective of the bank’s involvement. It criticised a restrictive interpretation of Article 22, arguing that such an approach could circumvent the GDPR’s protection by allowing the establishment of a credit score to be treated merely as a “preparatory act”, with only the bank’s final rejection classified as a “decision”[4].
Instead, the court adopted a broad interpretation. First, it concluded that where the calculation of a credit score amounts to “the automated establishment of a probability value based on personal data relating to a person and concerning his or her ability to meet payment commitments in the future”[5], it qualifies as profiling. In this case, SCHUFA gathered information about loan applicants, applied algorithms to analyse the data, and predicted the probability of loan repayment. This processing activity accordingly amounts to profiling.
Second, this form of profiling constitutes “automated individual decision-making” where “a third party, to which that probability value is transmitted, draws strongly on that probability value to establish, implement or terminate a contractual relationship with that person”[6]. In this case, the German bank’s rejection of the online loan application was heavily influenced by SCHUFA’s credit score. The contractual relationship in question could equally extend to insurance applications with insurers, rent agreements with landlords, or electricity supply contracts with utility providers. On this basis, the court has now ruled that a CRA’s automated credit scoring can in itself constitute a decision under Article 22.
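To make the notion of a “probability value” concrete, below is a minimal Python sketch of what such an automated calculation might look like. Everything in it (the features, weights, formula, and approval threshold) is hypothetical and purely illustrative; SCHUFA’s actual methodology is undisclosed, which is precisely what the dispute over access concerned.

```python
import math

# Hypothetical illustration only: the features, weights, and threshold below
# are invented for this sketch and bear no relation to SCHUFA's actual
# (undisclosed) scoring methodology.

def toy_probability_value(monthly_income: float,
                          monthly_expenses: float,
                          missed_payments: int,
                          years_of_credit_history: float) -> float:
    """Return an estimated probability (0 to 1) of meeting future payment commitments."""
    # A hypothetical weighted combination of personal data points.
    z = (0.8 * math.log1p(max(monthly_income - monthly_expenses, 0.0))
         - 1.5 * missed_payments
         + 0.3 * years_of_credit_history
         - 2.0)  # hypothetical intercept
    # The logistic function maps the raw value onto a probability.
    return 1.0 / (1.0 + math.exp(-z))

# A third party "drawing strongly" on the transmitted value, as in the judgment:
probability = toy_probability_value(3200.0, 2100.0, 1, 6.0)
print(f"probability value: {probability:.2f}")
print("loan", "approved" if probability >= 0.5 else "rejected")
```

The point of the sketch is structural rather than mathematical: once a value like this is transmitted to a lender who relies on it, the calculation itself is the “decision” within the meaning of Article 22.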
2. Whether a CRA could refuse to disclose its detailed scoring methodology on grounds of commercial secrecy
While the court did not explicitly answer this question, its reasoning suggests a negative response. First, the court highlighted the additional information obligations that the GDPR imposes on CRAs. These obligations require CRAs to inform applicants of the existence of automated credit scoring, to provide meaningful information about the logic involved, and to explain the significance and the envisaged consequences of the credit scoring.[7]
Furthermore, the court underscored the applicant’s rights under data protection laws. Applicants are entitled to obtain human intervention, express their views on the automated credit scoring, obtain an explanation of the credit scoring after such assessment, and contest their loan application decision if rejected.[8]
Considering both the CRA’s obligations and the applicant’s access rights, the court implied that “trade secrecy” cannot be used as a legitimate reason to withhold the detailed scoring methodology. However, the court did not clarify the level of detail required for the disclosure of the scoring methodology.
The increasing use of automated systems calls for algorithmic transparency. Yet many of these algorithms may be protected as intellectual property, often belonging to third parties. How to balance transparency obligations with intellectual property protection remains a challenging issue.
Data Protection Considerations for Stakeholders
The stakeholders involved in this judgment fall into three main categories: CRAs, lenders, and loan/credit applicants.
The process begins when the applicant completes the lender’s application form, providing personal information such as name, address, salary, and monthly expenses. The lender, acting as a data controller, then shares this information with a CRA and requests the applicant’s credit score. Under the lender’s instructions, the CRA acts as a data processor: it matches the provided information with data it already holds from other sources, calculates the applicant’s credit score, and transmits it back to the lender. The lender makes the final decision to reject or approve the loan application. Although not illustrated in the map, it should be noted that if an individual requests a credit score directly from a CRA, the CRA will be designated as a data controller, since it then determines the purposes and means of the processing.
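For readers who find the flow easier to follow in code, here is a minimal sketch of the same exchange. All names, fields, and the placeholder scoring formula are hypothetical; the comments mark the GDPR role each party plays.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the data flow described above; the GDPR roles
# are noted in comments, and the scoring formula is a placeholder.

@dataclass
class Application:
    # Personal data collected through the lender's application form.
    name: str
    address: str
    salary: float
    monthly_expenses: float

@dataclass
class CreditReferenceAgency:
    # Data the CRA already holds about individuals from other sources.
    records: dict = field(default_factory=dict)

    def score(self, app: Application) -> float:
        # Acting as a data processor on the lender's instructions: match the
        # submitted data against existing records, then calculate a score.
        history = self.records.get(app.name, {"missed_payments": 0})
        affordability = max(app.salary - app.monthly_expenses, 0.0) / app.salary
        return max(affordability - 0.2 * history["missed_payments"], 0.0)

def lender_decides(app: Application, cra: CreditReferenceAgency) -> str:
    # The lender, as data controller, shares the data, requests the score,
    # and takes the final decision on the application.
    return "approved" if cra.score(app) >= 0.4 else "rejected"

cra = CreditReferenceAgency(records={"A. Applicant": {"missed_payments": 0}})
app = Application("A. Applicant", "1 Example Street", 3000.0, 1500.0)
print(lender_decides(app, cra))  # "approved" with these hypothetical numbers
```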
Building on this data flow map and the judgment analysis, we have summarised below the data protection considerations for each stakeholder:
1. Credit Reference Agencies (CRAs)
- Identify a legal basis to issue credit scores: following the judgment, Article 22 of the GDPR generally prohibits automated credit scoring, and hence prevents CRAs from processing data to calculate credit scores, unless they can identify a lawful basis for such processing. The three lawful bases are explicit consent, contractual necessity, and legal authorisation.[9]
The GDPR sets a high bar for obtaining explicit consent, particularly in terms of securing the applicant’s affirmative action. In addition, applicants can withdraw their consent at any time, creating extra costs for CRAs, which must then delete the personal information and the credit score. Obtaining explicit consent would therefore be a challenging option for CRAs.
Contractual necessity could be an option. When an applicant asks a CRA directly to calculate their credit score, the details of this data processing will normally appear in the terms and conditions they sign with the CRA. Automated credit scoring may also be authorised under national law. For example, under English law, Section 145(8) of the Consumer Credit Act 1974 provides that a person operates a credit reference agency if it carries on, by way of business, the activity of providing credit references[10], as specified in article 89B of the Financial Services and Markets Act 2000 (Regulated Activities) Order 2001. Article 89B defines the provision of credit references as the “furnishing of persons with information relevant to the financial standing of individuals or relevant recipients of credit”[11]. These provisions allow CRAs to automatically calculate an individual’s creditworthiness, which can then be supplied to banks, building societies, retailers, or other lenders. Thus, major UK CRAs, such as Equifax, Experian and TransUnion, may rely on this lawful basis.
- Ensure compliance with transparency provisions: the judgment underscores that CRAs are obliged to disclose, to some extent, their scoring methodology. CRAs should adopt a proactive approach to meeting this requirement. When asked directly by an applicant to calculate a score, CRAs must inform them about the use of automated credit scoring. If the credit score is requested by a lender, CRAs should check whether the lender has informed applicants appropriately and provided CRA contact details for further inquiries; this notification could be presented on the final page of the online application form. After the application is submitted, if CRAs receive an access request from the applicant, they must respond within the one-month time limit.
- Implement safeguards to mitigate discriminatory effects: the potential for inaccuracies and for discrimination based on applicants’ ethnicity, political opinion, religion or sexuality is a concern in any automated processing, including credit scoring. To mitigate these risks, CRAs should employ appropriate mathematical or statistical procedures in their profiling algorithms to minimise errors and discriminatory effects. One starting point is to design algorithms that exclude specific sensitive personal data from consideration, as sketched below.
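Below is a minimal sketch of that safeguard: dropping special-category fields before any score is calculated. The field names are hypothetical, and a real pipeline would also need to address proxy variables (for example, a postcode correlating with ethnicity), which simple filtering does not catch.

```python
# Hypothetical illustration: strip special-category data from an applicant
# record before it reaches the scoring model. Field names are invented.

SENSITIVE_FIELDS = {"ethnicity", "political_opinion", "religion", "sexuality"}

def strip_sensitive(record: dict) -> dict:
    """Return a copy of the applicant record without special-category fields."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

applicant = {
    "salary": 3000.0,
    "monthly_expenses": 1800.0,
    "religion": "example-value",  # must never feed the scoring model
}
print(strip_sensitive(applicant))  # {'salary': 3000.0, 'monthly_expenses': 1800.0}
```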
2. Lenders
Lenders do not need to obtain consent from applicants before sharing their information with CRAs or requesting a credit score. Lenders can rely on the lawful basis of contractual necessity: the CRA’s role in generating the credit score can be seen as necessary to fulfil the credit agreement between the applicant and the lender. The UK ICO supports this position, noting that where a lender relies on an automatically generated credit score from a CRA to decide on a loan, the request for automated credit scoring can be justified as necessary for contract performance. Lenders must therefore ensure that appropriate data protection clauses are included in their contracts with applicants.
3. Loan/Credit Applicants
This judgement provides valuable insights for applicants regarding their rights:
- Applicants are entitled to request meaningful information about the automated credit scoring, including a breakdown of the methodology used by CRAs;
- Applicants can ask lenders to explain why their application is rejected and contest the decision if they believe it is incorrect or unfair; and
- Applicants have the right to ask lenders or CRAs to rectify any inaccurate information held in their credit reference file.
Conclusions
The CJEU’s judgment in SCHUFA II has clarified how the court will interpret Article 22 of the GDPR. The court’s broad interpretation establishes that a CRA’s automated credit scoring constitutes decision-making, as a third party may draw strongly on the score to determine a contractual relationship. The ruling prompts relevant stakeholders, including CRAs and lenders, to seek professional guidance on how to implement the data protection considerations derived from this judgment.
While we have provided some advice from each perspective, please feel free to contact us for a more comprehensive analysis tailored to your specific needs.
[1] GDPR – Article 22(1): “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”
[2] The SCHUFA II Case and The Widespread Use of Automated Credit Rating Systems. https://www.informationgovernanceservices.com/the-schufa-ii-case-and-the-widespread-use-of-automated-credit-rating-systems/. Accessed 08 August 2024.
[3] Court of Justice of the European Union (CJEU). Judgment in Case C-634/21, SCHUFA Holding (Scoring). https://curia.europa.eu/juris/document/document.jsf?text=&docid=280426&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=2023696. Accessed 08 August 2024.
[4] CJEU Judgment – Paragraph 61: “there would be a risk of circumventing Article 22 of the GDPR and, consequently, a lacuna in legal protection if a restrictive interpretation of that provision was retained, according to which the establishment of the probability value must only be considered as a preparatory act and only the act adopted by the third party can, where appropriate, be classified as a ‘decision’ within the meaning of Article 22(1) of that regulation”.
[5] CJEU Judgment – Paragraph 73: “Article 22(1) of the GDPR must be interpreted as meaning that the automated establishment, by a credit information agency, of a probability value based on personal data relating to a person and concerning his or her ability to meet payment commitments in the future constitutes ‘automated individual decision-making’ within the meaning of that provision, where a third party, to which that probability value is transmitted, draws strongly on that probability value to establish, implement or terminate a contractual relationship with that person.”
[6] Same as above.
[7] GDPR – Article 13(2)(f), 14(2)(g), and 15(1)(h): “the controller shall … provide the data subject with the following further information necessary to ensure fair and transparent processing: the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.”
[8] GDPR – Recital 71: “… In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision.”
[9] GDPR – Article 22(2): “Paragraph 1 shall not apply if the decision: (a) is necessary for entering into, or performance of, a contract between the data subject and a data controller; (b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or (c) is based on the data subject’s explicit consent.”
[10] The Consumer Credit Act 1974 – Section 145(8): “A person (“P”) operates a credit reference agency if P carries on, by way of business, an activity of the kind specified by article 89B of that Order (providing credit references).”
[11] The Financial Services and Markets Act 2000 (Regulated Activities) Order 2001 – Article 89B(1): “Furnishing of persons with information relevant to the financial standing of individuals or relevant recipients of credit is a specified kind of activity if the person has collected the information for that purpose.”