Wednesday, 5 March 2025

Artificial intelligence in financial services - new report by Finance Watch

Today, Finance Watch, a non-profit association dedicated to reforming finance in the interest of European citizens, published a new report: 'Artificial intelligence in finance: how to trust a black box?' authored by its Chief Economist Thierry Philipponnat.

As AI-powered systems increasingly drive financial decision-making in areas such as creditworthiness assessments, insurance pricing and investment products, the report asserts that the core principles of financial regulation, namely accountability, responsibility and transparency, are being tested.

Against this backdrop, the report identifies several critical concerns: 

  • Lack of transparency: AI models operate as “black boxes”, generating outputs without clear explanations of their reasoning, making human oversight and intervention impossible (a minimal illustration follows this list).
  • Consumer protection under threat: In retail finance, the deployment of AI could lead to opaque creditworthiness assessments (see here for an example), pricing discrimination, discriminatory lending, and misleading financial advice.
  • Supervisors face AI challenges: Those tasked with enforcing regulation face difficulties in keeping pace with financial institutions' deployment of AI and in delivering on their mandates.
  • Market stability is at risk: Increasingly dependent on third-party AI providers, financial institutions face operational risks from unregulated external systems and concentration risks, where a handful of dominant AI firms control critical models and infrastructure, creating systemic vulnerabilities. 
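
To make the “black box” concern concrete, here is a minimal, hypothetical Python sketch. The scoring function stands in for a proprietary model; its field names, weights and threshold are invented for illustration and are not taken from the report or from any real scoring system. The point is that the applicant receives only a decision, with nothing in the output explaining which data drove it.

```python
# A minimal, hypothetical sketch of the "black box" concern. The function
# stands in for a proprietary model: all names, weights and the threshold
# are invented for illustration, not taken from any real scoring system.

def opaque_credit_score(applicant: dict) -> float:
    """Inputs go in, a score comes out; the reasoning stays hidden."""
    # In a real system this could be a large ensemble or a neural network;
    # the caller only ever sees the final number.
    weights = {"income": 0.4, "debt": -0.6, "years_at_address": 0.2}
    raw = sum(weights[k] * applicant[k] for k in weights)
    return max(0.0, min(1.0, 0.5 + raw / 100))

applicant = {"income": 30, "debt": 45, "years_at_address": 2}
score = opaque_credit_score(applicant)
print(f"score={score:.2f} -> {'approve' if score >= 0.5 else 'refuse'}")
# Prints: score=0.35 -> refuse. The applicant learns only the outcome;
# the logic behind it is not disclosed.
```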

In response, the report urges a reassessment of the financial regulatory framework: 

  1. Expand the scope of the AI Act to cover all financial services;
  2. Establish a clear liability regime that holds providers of AI-powered services accountable for damages caused by an output of an AI system;
  3. Conduct a regulatory gap analysis to ensure all AI-driven financial activities are adequately regulated.

Tuesday, 4 March 2025

Credit reference agencies, consumer profiling and the GDPR: the CJEU in C-203/22

On 27 February 2025, the CJEU delivered an important judgment on the interpretation of Article 15(1)(h) and Article 22 of Regulation (EU) 2016/679 (General Data Protection Regulation, GDPR) in Case C-203/22 CK v Magistrat der Stadt Wien (Dun & Bradstreet Austria GmbH).

The facts

A mobile phone operator refused CK’s request to conclude or extend a mobile telephone contract for a monthly payment of a mere EUR 10. The refusal was justified by CK’s failure to pass a creditworthiness check carried out, by way of an automated assessment, by the credit reference agency D & B. Unsurprisingly, CK was unhappy with the decision; her credit score was good. She brought the matter before the Austrian data protection authority and, with this, embarked on a long road to the preliminary reference, passing through various instances and avenues of protection.

The referring court raised several questions, which the CJEU grouped, in essence, into two:

The first question

Must Article 15(1)(h) be interpreted as meaning that, in the case of automated decision-making, including profiling, within the meaning of Article 22(1), the data subject may require the controller to provide ‘meaningful information about the logic involved’ in the decision-making, understood as an exhaustive explanation of the procedure and principles actually applied in using personal data to obtain a specific result, in this case a creditworthiness assessment?

According to Article 15(1)(h), the data subject has the right to obtain from the controller confirmation as to whether his or her personal data are being processed and, where applicable, information on the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4), as well as meaningful information about the logic involved and the significance and envisaged consequences of such processing for the data subject.

Article 22 provides that the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, and that special categories of data listed in Article 9(1) GDPR, such as racial or ethnic origin or religious beliefs, cannot, in principle, be used in such processing.

Profiling, in this context, means the automated processing of personal data which uses those data to analyse or predict the consumer's economic situation.

In its analysis, the CJEU first turned to a literal interpretation of the wording of Article 15(1)(h) and concluded that the concept of ‘meaningful information’ under that provision may have various meanings in the different language versions of the GDPR, which should be taken to be complementary to each other. In addition, the ‘logic involved’ in automated decision-making, which constitutes the subject matter of the ‘meaningful information’, is capable of covering a wide range of ‘logics’ concerning the use of personal data and other data with a view to obtaining a specific result by automated means. The CJEU held that the provision covers all relevant information concerning the procedure and principles relating to the use, by automated means, of personal data with a view to obtaining a specific result.

The CJEU next turned to a contextual analysis of the concept of ‘meaningful information about the logic involved’ within the meaning of Article 15(1)(h). In this analysis, the CJEU looked at the Guidelines on automated individual decision-making and profiling for the purposes of Regulation 2016/679 and at other provisions of the GDPR imposing information duties on data controllers. The CJEU concluded that these duties relate to all relevant information, which should be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language.

Finally, the CJEU looked at the purpose of the provision, asserting that the purpose of the data subject’s right to obtain the information provided for in Article 15(1)(h) is to enable him or her to effectively exercise the rights conferred on him or her by Article 22(3), namely, the right to express his or her point of view and to contest the relevant decision. This, in turn, requires the right to obtain an explanation of the decision.

The CJEU then concluded that, under Article 15(1)(h), the right to obtain ‘meaningful information about the logic involved’ in automated decision-making must be understood as a right to an explanation of the procedure and principles actually applied in order to use, by automated means, the personal data of the data subject with a view to obtaining a specific result, such as a credit profile. In order to enable the data subject to effectively exercise the rights conferred on him/her by the GDPR and, in particular, Article 22(3), that explanation must be provided by means of relevant information in a concise, transparent, intelligible and easily accessible form.

Notably, the court further provided guidance on what is considered to be ‘meaningful information about the logic involved’ in automated decision-making. The procedure and principles actually applied must be explained in such a way that the data subject can understand which of his/her personal data have been used in the automated decision-making and the extent to which a variation in the personal data taken into account would have led to a different result. The requirements of Article 15(1)(h) cannot be met by the mere communication of a complex mathematical formula, such as an algorithm, or by a detailed description of all the steps in automated decision-making, since neither of those would constitute a sufficiently concise and intelligible explanation.
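
The kind of explanation the Court describes, showing which personal data were used and how a variation in them would have changed the result, resembles what the technical literature calls a counterfactual explanation. The following Python sketch illustrates the idea under invented assumptions: the model, its weights, the threshold and the probed ranges are hypothetical and are not drawn from the judgment or from any real credit-scoring system.

```python
# A hypothetical sketch of the explanation the Court describes: the
# controller shows which data were used and what variation in each input
# would have changed the outcome. The model, weights, threshold and probed
# ranges below are invented for illustration only.

WEIGHTS = {"income": 0.4, "debt": -0.6, "years_at_address": 0.2}
THRESHOLD = 0.5

def score(applicant: dict) -> float:
    """A stand-in for the agency's scoring model."""
    raw = sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)
    return max(0.0, min(1.0, 0.5 + raw / 100))

def explain(applicant: dict) -> None:
    """Report, per input, the smallest probed variation that flips the decision."""
    base_approved = score(applicant) >= THRESHOLD
    print(f"decision: {'approve' if base_approved else 'refuse'}")
    for field, value in applicant.items():
        # Probe alternative values for this single field, nearest first.
        for delta in sorted(range(-50, 51), key=abs):
            varied = {**applicant, field: value + delta}
            if (score(varied) >= THRESHOLD) != base_approved:
                print(f"  {field}: {value} -> {value + delta} "
                      f"would change the decision")
                break
        else:
            print(f"  {field}: no probed variation changes the decision")

explain({"income": 30, "debt": 45, "years_at_address": 2})
```

A data subject given this kind of output can see which data were used and how much each would have to differ for a different result, without ever being shown the underlying formula, which is precisely the balance the judgment strikes.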

The second question

Another important contribution of the present judgment is the consideration of the relationship between Article 15(1)(h) and Directive 2016/943 on trade secrets, given that D&B argued that the logic of its automated decision-making, including which information is taken into account and how, is a trade secret and should, therefore, not be disclosed.

The CJEU highlighted that the protection of personal data is not an absolute right. Restrictions of the scope of the obligations and rights provided for in, inter alia, Article 15 of the GDPR are possible, but only where such a restriction respects the essence of the fundamental rights and freedoms at stake and is a necessary and proportionate measure to safeguard the rights and freedoms of others. However, the result of any such balancing cannot be a refusal to provide the data subject with any information at all.

The CJEU concluded that Article 15(1)(h) must be interpreted as meaning that, where the controller takes the view that the information to be provided to the data subject constitutes a trade secret, within the meaning of point 1 of Article 2 of Directive 2016/943, the controller is required to provide the allegedly protected information to the competent supervisory authority or court, which must balance the rights and interests at issue with a view to determining the extent of the data subject’s right of access provided for in Article 15 of the GDPR.

Our analysis

This decision is significant in addressing the long-standing lack of transparency in automated decision-making by credit reference agencies, an important problem in the EU. In most countries, consumers have access to their credit reports and can therefore know what data is considered in producing a credit score. Credit reference agencies have, however, refused to disclose how this data is processed: the logic behind their decision-making and the way and extent to which various data are considered (weighted). Based on this decision, consumers are still not entitled to obtain that information directly, but a first step has been made by mandating disclosure to the relevant authority, which then decides whether or not to disclose it to the consumer, balancing the rights and interests of the two parties. This and other judgments of the CJEU (see C-634/21 SCHUFA Holding) may gradually bring transparency into this traditionally opaque area.

As credit reference agencies nowadays use artificial intelligence for automated decision-making, the judgment is also relevant for advancing the transparency of AI systems.

Finally, given that the judgment concerns the operation of credit reference agencies, which creditors frequently use to assess the affordability of loan applications, it is also relevant for the responsible lending rules in Directive 2023/2225 on consumer credit (CCD2), whose Article 18 refers to creditworthiness assessments based on automated processing of personal data.