
Tuesday, 2 September 2025

Key GDPR Fines in Mid-2025: Luka (Replika), TikTok, and ING Bank Śląski

This post discusses three recent decisions of Data Protection Authorities imposing fines for GDPR infringements on Luka Inc., TikTok, and ING Bank Śląski. While our analyses usually focus on judgments of the Court of Justice of the European Union, in this case we turn to decisions of national authorities. Such decisions tend to attract significant attention, whether because of the seriousness of the violations or the size of the penalties, which makes them a frequent subject of debate. The three cases selected meet these criteria and were all issued within the past few months. They are also directly relevant to consumers, as they highlight risks that many of us face in everyday life when using apps or online services where personal data may be mishandled.
 
 
Luka Inc. 

On 10 April 2025, the Italian Data Protection Authority (Garante per la protezione dei dati personali) issued a decision against Luka Inc., the U.S. company behind the Replika chatbot. Replika is marketed as an AI “companion” designed to boost users’ mood and wellbeing, and can be set up as a friend, mentor, therapist or even a romantic partner. But according to the Garante, the way Luka handled users’ personal data fell far short of what the GDPR requires.

The investigation showed that Replika’s privacy policy did not clearly identify the legal ground for the many different ways in which users’ data were processed – for example, data used for running the chatbot versus data used for developing the large language model behind it. Instead of specifying purposes and corresponding legal bases, the policy only gave vague, generic statements like: “We care about the protection and confidentiality of your data. We therefore only process your data to the extent that: It is necessary to provide the Replika services you are requesting, you have given your consent to the processing, or we are otherwise authorized to do so under the data protection laws” (btw – doesn’t that sound familiar from many privacy policies?). This lack of granularity made it impossible for users to understand how their data were really being used, in breach of Articles 5(1)(a) and 6 GDPR.

What’s more, the privacy notice was only available in English, even though the service was offered in Italy. It also failed to explain key points required under GDPR: what kinds of data were collected, how long they were stored, whether data were transferred outside the EU, and for what purpose. Some statements were even misleading, for instance, suggesting that personal data might be transferred to the U.S., while the company later claimed no such transfers took place. Such gaps and contradictions meant that users could not make informed choices about their data.

The most troubling finding, however, concerned children: the Garante concluded that Luka had failed to implement effective safeguards for minors. Although the service was formally intended for adults, it lacked genuine age-verification mechanisms. Registration required only a name, email address, and gender, which allowed minors to create accounts. Even when users declared they were under 18, no technical barrier prevented them from accessing the platform. In practice, this meant that children could be exposed to age-inappropriate content, including sexually explicit material. Moreover, even after updates to the privacy policy, technical testing showed that under-18 users could still bypass the age restriction simply by editing their profile.
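
To illustrate the kind of safeguard the Garante found missing, here is a minimal Python sketch of a server-side age gate that records the declared birth date once at registration and re-checks it on every session, so that later profile edits cannot unlock the service. This is our own illustration under hypothetical names (register, open_session): the decision does not prescribe any particular mechanism, and genuine age assurance requires more than a self-declared birth date.

```python
# A minimal sketch, not a compliance recipe: a server-side age gate that cannot
# be bypassed by editing other profile fields. Names here are hypothetical, and
# real age assurance needs more than a self-declared birth date.
from datetime import date

MIN_AGE = 18

def age(birth_date: date) -> int:
    today = date.today()
    return today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))

def register(email: str, birth_date: date, users: dict) -> None:
    if age(birth_date) < MIN_AGE:
        raise PermissionError("Registration refused: service is restricted to adults.")
    # Store the declared birth date once; profile edits must never overwrite it.
    users[email] = {"birth_date": birth_date}

def open_session(email: str, users: dict) -> None:
    # Re-check on every session against the stored birth date, so changing
    # other profile fields later cannot unlock the service.
    if age(users[email]["birth_date"]) < MIN_AGE:
        raise PermissionError("Access blocked: user is under 18.")

users: dict = {}
register("adult@example.com", date(1990, 5, 1), users)
open_session("adult@example.com", users)  # passes for an adult account
```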
 
For these violations, the Garante imposed an administrative fine of EUR 5,000,000, representing half of the maximum amount available under Article 83(5) GDPR.
 
 
TikTok Technology Limited
 
Another significant decision was issued by the Irish Data Protection Commission (DPC) in May 2025 against TikTok Technology Limited. Although the full text of the decision has not yet been published, the official press release provides insight into the reasons for the sanction.
 
The inquiry examined both the lawfulness of TikTok’s transfers of European users’ personal data to China and the adequacy of the company’s transparency regarding those transfers. The DPC concluded that TikTok had infringed the GDPR in two key respects.
 
First, the Commission found that TikTok’s transfers of user data to China violated Article 46(1) GDPR. The company failed to verify, guarantee, and demonstrate that personal data of European users – remotely accessed by staff in China – was afforded a level of protection essentially equivalent to that required within the EU. TikTok’s own assessments of Chinese law highlighted serious divergences from EU standards, particularly risks under the Anti-Terrorism Law, the Counter-Espionage Law, and the National Intelligence Law. Nevertheless, the company did not adequately address these risks or ensure that its contractual safeguards were effective.
 
Second, the DPC held that TikTok had not complied with the information duties set out in Article 13(1)(f) GDPR. Earlier versions of its privacy policy (in force between July 2020 and December 2022) failed to identify the countries involved in data transfers and did not explain the nature of the processing – for instance, that personnel in China could remotely access data stored in Singapore and the United States. This lack of clarity prevented users from understanding who could access their data and under what conditions.
 
The decision imposed not only administrative fines but also corrective measures. TikTok was given six months to bring its practices into compliance, failing which data transfers to China would have to be suspended altogether. The total fine amounted to EUR 530,000,000, comprising EUR 485,000,000 for the unlawful transfers and EUR 45,000,000 for the lack of transparency.
 
 
ING Bank Śląski
 
The third decision discussed here was delivered on 23 July 2025 by the Polish Data Protection Authority (UODO) against ING Bank Śląski S.A., which was fined PLN 18,416,400 (around EUR 4,000,000). The case revolved around the bank’s widespread practice of copying and scanning the ID cards of both existing and potential clients, even in situations where such a step was not required by law. The bank introduced this practice after an amendment of the Polish anti-money laundering provisions, interpreting them as justifying the systematic copying of IDs.
 
The investigation revealed that between April 2019 and September 2020 the bank systematically scanned ID documents not only during customer onboarding, but also in contexts where no anti-money laundering (AML) obligations applied – for example, when a customer filed a complaint about an ATM. In practice, the bank’s internal procedures made the delivery of services conditional on handing over a scanned ID, leaving consumers with no real choice.
 
As emphasized in the decision, both AML law and the GDPR require banks to conduct a risk-based assessment and determine, case by case, whether copying an ID is genuinely necessary. ING failed to perform such assessments. Instead, it adopted blanket rules requiring ID copies in numerous situations, regardless of whether AML obligations applied. As a result, the bank processed extensive amounts of sensitive identifying information without a valid legal basis under Article 6 GDPR. Although no specific harm was demonstrated, the decision underscores that ID cards contain a wide range of personal data – including full name, date of birth, parents’ names, unique national ID number (PESEL), photograph, and document series. Taken together, these data significantly increase the risk of identity theft or fraudulent loans. Given that ING had millions of individual and corporate clients during the period in question, the potential consequences of such unnecessary data collection were substantial.

Sunday, 6 April 2025

Do you really need my title? The CJEU says no – a win for consumer privacy in case C‑394/23


Have you ever been asked about your title while purchasing something online? It’s a common practice, but most of us (consumers) don’t realise that it raises concerns from a data protection perspective, especially when the seller requires us to provide this information and does not allow us to skip the form field and place the order without disclosing our gender. This practice was challenged by the French association Mousse in proceedings against the French Data Protection Authority (Commission nationale de l’informatique et des libertés, CNIL) and the French railway operator SNCF Connect, eventually resulting in a preliminary ruling by the Court of Justice of the EU (Case C‑394/23).

The facts

SNCF Connect sells rail travel documents such as train tickets and discount cards via its website and mobile applications. When purchasing these products, customers are required to indicate their title by selecting either Monsieur (Mr) or Madame (Ms). This requirement raised Mousse’s concerns about its compliance with the General Data Protection Regulation (GDPR).

The association filed a complaint with CNIL, arguing that the collection of titles lacked a valid legal basis under Article 6(1) GDPR, violated the data minimisation principle under Article 5(1c), and failed to meet the transparency and information obligations set out in Article 13 GDPR. The CNIL rejected the complaint, concluding that collecting titles was justified as necessary for the performance of a contract under Article 6(1b) and aligned with accepted norms of personalised communication (paras. 13–15). Mousse appealed the decision to the French Conseil d’État, which referred several preliminary questions to the CJ.

The ruling

The Court of Justice essentially said “no” to this kind of data processing. It did not declare that the processing of title-related personal data is categorically prohibited under the GDPR, but stressed that in the specific context of this case, it “does not appear to be either objectively indispensable or essential to enable the proper performance of the contract” concluded with the consumer (para. 39).

Here are the key takeaways from the judgment:

1. The Court focused its analysis on Articles 6(1b) and 6(1f) GDPR, which establish when data processing is lawful. Article 6(1b) allows processing when it is “necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract”, while Article 6(1f) permits it if it serves a legitimate interest of a controller or a third party, provided that interest is not overridden by the data subject’s fundamental rights and freedoms.

The Court made it clear that when relying on contractual necessity under Article 6(1b), the controller must show that the processing is “objectively indispensable for a purpose that is integral to the contractual obligation intended for the data subject” (para. 33). In other words, the controller must demonstrate that the processing “must be essential for the proper performance of the contract concluded between the controller and the data subject and, therefore, that there are no workable, less intrusive alternatives” (para. 34). Applying this to the case at hand, the Court rejected the CNIL’s and SNCF’s claim that collecting customers’ titles is necessary for personalised commercial communication, and that such communication is an essential part of the contract. According to the Court:

“Commercial communication may constitute a purpose forming an integral part of the contractual service concerned, since the provision of such a rail transport service involves, in principle, communicating with the customer in order, inter alia, to send him or her a travel document by electronic means, to inform him or her of any changes affecting the corresponding journey, and to allow exchanges with the after-sales service. That communication may require adherence to accepted practices and may include, in particular, forms of addressing a customer, in order to show that the undertaking concerned respects its customer and thereby to safeguard that undertaking’s brand image. However, it appears that such communication does not necessarily have to be personalised based on the gender identity of the customer concerned” (paras. 37–38).

In short, personalising content is not necessary if the same service can be provided in a standard, non-personalised way. The controller could instead use more privacy-friendly alternatives, such as generic and inclusive forms of address that do not rely on the consumer’s assumed gender identity (para. 40).

2. Furthermore, the systematic and generalised processing of all customers’ titles cannot be justified by the mere fact that some of them use night-train services. Even if titles are needed to adapt transport services on night trains, which have carriages reserved for persons of the same gender identity, and to assist passengers with disabilities, in the Court’s view this does not justify collecting the titles of all customers, including those who travel during the daytime or who do not have disabilities. Such a practice is disproportionate and contrary to the principle of data minimisation (para. 42).

3. As regards the ‘legitimate interests’ ground, the Court found that personalised commercial communication can be achieved by using customers’ first and last names alone, since requiring their title or gender identity is not strictly necessary, particularly in light of the data minimisation principle (para. 55). Moreover, it’s important to note that Article 6(1f) GDPR does not allow “common practices or social conventions” to justify the necessity of processing personal data (para. 56).

4. Finally, the fact that data subjects may object to the processing under Article 21 GDPR is irrelevant in this context. According to the Court, this opt-out mechanism should not be taken into account when assessing whether the original data collection was lawful (para. 70). To put it simply, controllers cannot justify collecting unnecessary personal data by simply allowing individuals to object afterwards. While the right to object is an important safeguard, it does not give controllers a free pass to collect data first and handle objections later.

Our comment

The judgment has a direct impact on the practices of certain data controllers who, without a valid legal basis, collect excessive data concerning consumers’ titles and gender identity, where such information is not necessary for the purposes of processing. The CJ ruling serves as a clear reminder that personal data must be processed in accordance with the principle of data minimisation, meaning that only data strictly necessary to achieve the intended purpose should be collected and used.

Importantly, the Court did not declare that the collection of such data is absolutely prohibited under the GDPR. Rather, it emphasised that lawfulness depends on the specific context. For example – although not stated explicitly, this can be inferred from the reasoning – a controller may process such data on the basis of the data subject’s consent. In that case, a form used by the consumer to conclude a contract could include an optional field allowing the individual to indicate a preferred form of address. Crucially, this field would not be mandatory: if the consumer wished to provide that information, they could do so; if not, they could simply skip it without consequence. 
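
Although the Court does not spell out an implementation, its reasoning points to something as simple as the following sketch of an order form in which the form of address is strictly optional, with a neutral fallback. All field and function names here are hypothetical:

```python
# Minimal sketch of an order form where the form of address is optional and
# self-chosen - an illustration of the Court's reasoning, not a requirement
# spelled out in the judgment.
from dataclasses import dataclass
from typing import Optional

@dataclass
class OrderForm:
    first_name: str
    last_name: str
    email: str
    # Stays None if the customer skips the field, with no consequence for
    # placing the order (data minimisation).
    form_of_address: Optional[str] = None

def salutation(order: OrderForm) -> str:
    if order.form_of_address:
        return f"Dear {order.form_of_address} {order.last_name},"
    # Generic, gender-neutral fallback using first and last name only.
    return f"Dear {order.first_name} {order.last_name},"

print(salutation(OrderForm("Alex", "Martin", "alex@example.com")))
# -> "Dear Alex Martin,"
```

The point of the design is that skipping the field carries no consequence for concluding the contract, in line with the data minimisation principle.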


PS. In the context of this judgment, it is also worth drawing attention to another recent CJEU decision (case C‑247/23) which likewise concerned the processing of gender identity data. In that case, the Court reaffirmed that one of the fundamental duties of a data controller is to ensure the accuracy of the personal data processed. If a data subject exercises his or her right to rectification, the controller should not impose disproportionate administrative burdens that unjustifiably hinder the exercise of that right. The case involved a request to update the gender information in a public register maintained by a Hungarian authority. The individual, registered as female, sought to have the record amended to reflect his male gender, submitting medical documentation to support the request. The authority, however, demanded proof of surgical gender reassignment – a requirement the CJ found excessive and incompatible with the essence of fundamental rights, including the rights to personal integrity and respect for private life.

Tuesday, 4 March 2025

Credit reference agencies, consumer profiling and the GDPR: the CJEU in C-203/22

On 27 February 2025, the CJEU delivered an important judgment on the interpretation of Article 15(1)(h) and Article 22 of Regulation (EU) 2016/679 (General Data Protection Regulation, GDPR) in case C-203/22 CK v Magistrat der Stadt Wien (Dun & Bradstreet Austria GmbH).

The facts

A mobile phone operator refused CK’s request to conclude or extend a mobile telephone contract with a monthly payment of a mere EUR 10. The refusal was justified by CK not passing a creditworthiness check with the credit reference agency D & B, which carried out an automated assessment. Unsurprisingly, CK was unhappy with the decision; her credit score was good. She brought the matter to the Austrian data protection authority and, with this, began a long journey to the preliminary reference, passing through various instances and avenues of protection.

The referring court raised several questions, which the CJEU grouped into essentially two questions:

The first question

Must Article 15(1)(h) be interpreted as meaning that, in the case of automated decision-making, including profiling, within the meaning of Article 22(1), the data subject may require the controller to provide ‘meaningful information about the logic involved’ in the decision-making – that is, an exhaustive explanation of the procedure and principles actually applied in using personal data to obtain a specific result, in this case a creditworthiness assessment?

According to Article 15(1)(h), the data subject has the right to obtain from the controller confirmation as to whether his/her personal data are being processed, information on the use of automated decision-making where applicable, including profiling, referred to in Article 22(1) and (4), and meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

Article 22 provides that the data subject has the right not to be subject to a decision based solely on automated processing, including profiling, and that, as a rule, such decisions may not be based on the special categories of data listed in Article 9(1) GDPR, such as racial or ethnic origin or religious beliefs.

Profiling, in this context, means automated processing of personal data, consisting of using personal data to analyse or predict the consumer's economic situation.

In its analysis, the CJEU first turned to a literal interpretation of the wording of Article 15(1)(h) and concluded that the concept of ‘meaningful information’ under that provision may have various meanings in different language versions of the GDPR, which should be taken to be complementary to each other. In addition, the ‘logic involved’ in automated decision-making, which constitutes the subject matter of the ‘meaningful information’, is capable of covering a wide range of ‘logics’ concerning the use of personal data and other data with a view to obtaining a specific result by automated means. The CJEU held that the provision covers all relevant information concerning the procedure and principles relating to the use, by automated means, of personal data with a view to obtaining a specific result.

The CJEU next turned to a contextual analysis of the concept of ‘meaningful information about the logic involved’ within the meaning of Article 15(1)(h). In this analysis, the CJEU looked at the Guidelines on automated individual decision-making and profiling for the purposes of Regulation 2016/679 and at other GDPR provisions imposing information duties on data controllers. The CJEU concluded that these duties relate to all relevant information, which should be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language.

Finally, the CJEU looked at the purpose of the provision, asserting that the purpose of the data subject’s right to obtain the information provided for in Article 15(1)(h) is to enable him or her to effectively exercise the rights conferred on him or her by Article 22(3), namely, the right to express his or her point of view and to contest the relevant decision. This, in turn, requires the right to obtain an explanation of the decision.

The CJEU then concluded that under Article 15(1)(h) the right to obtain ‘meaningful information about the logic involved’ in automated decision-making must be understood as a right to an explanation of the procedure and principles actually applied in order to use, by automated means, the personal data of the data subject with a view to obtaining a specific result, such as a credit profile. In order to enable the data subject to effectively exercise the rights conferred on him/her by the GDPR and, in particular, Article 22(3), that explanation must be provided by means of relevant information in a concise, transparent, intelligible and easily accessible form. Notably, the Court further provided guidance on what counts as ‘meaningful information about the logic involved’ in automated decision-making. The procedures and principles actually applied must be explained in such a way that the data subject can understand which of his/her personal data have been used in the automated decision-making and the extent to which a variation in the personal data taken into account would have led to a different result. The requirements of Article 15(1)(h) cannot be met by the mere communication of a complex mathematical formula, such as an algorithm, or by a detailed description of all the steps in automated decision-making, since neither of those would constitute a sufficiently concise and intelligible explanation.
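
To picture what such an explanation could look like, consider the following toy sketch. The scoring rule, weights and threshold are entirely hypothetical (not D&B's model, and not a format mandated by the Court); it merely shows the two elements the CJEU highlights – which personal data were used, and how a variation in an input would have led to a different result – without handing the data subject the raw formula:

```python
# Hypothetical scoring rule for illustration only - not D&B's model and not a
# format mandated by the judgment. It names the data used and shows which
# variation in an input would have changed the outcome.

WEIGHTS = {"net_income_eur": 0.004, "open_loans": -0.8, "payment_defaults": -2.5}
THRESHOLD = 1.0  # scores at or above this pass the creditworthiness check

def score(applicant: dict) -> float:
    return sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)

def explain(applicant: dict) -> str:
    lines = [
        f"Data used: {', '.join(WEIGHTS)}",
        f"Your score: {score(applicant):.2f} (pass mark: {THRESHOLD})",
    ]
    # Counterfactual element: a variation that would have led to a different result.
    for feature, delta in [("net_income_eur", 250), ("open_loans", -1), ("payment_defaults", -1)]:
        varied = {**applicant, feature: applicant[feature] + delta}
        if (score(varied) >= THRESHOLD) != (score(applicant) >= THRESHOLD):
            lines.append(f"If {feature} had been {varied[feature]}, the outcome would have differed.")
    return "\n".join(lines)

print(explain({"net_income_eur": 1800, "open_loans": 3, "payment_defaults": 2}))
```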

The second question

Another important contribution of the present judgment is the consideration of the relationship between Article 15(1)(h) and Directive 2016/943 on trade secrets, given that D&B argued that the logic of their automated decision-making, including what information is considered in which way, is a trade secret and should, therefore, not be disclosed.  

The CJEU highlighted that the protection of personal data is not an absolute right. The scope of the obligations and rights provided for in, inter alia, Article 15 of the GDPR may be restricted, but only where such a restriction respects the essence of the fundamental rights and freedoms at stake and is a necessary and proportionate measure to safeguard the rights and freedoms of others. However, the result of any such balancing must not be a refusal to provide the data subject with any information at all.

The CJEU concluded that Article 15(1)(h) must be interpreted as meaning that, where the controller takes the view that the information to be provided to the data subject constitutes a trade secret, within the meaning of point 1 of Article 2 of Directive 2016/943, that controller is required to provide the allegedly protected information to the competent supervisory authority or court, which must balance the rights and interests at issue with a view to determining the extent of the data subject’s right of access provided for in Article 15 of the GDPR.

Our analysis

This decision is significant in addressing the long-standing lack of transparency in automated decision-making by credit reference agencies, an important problem in the EU. In most countries we have access to our credit reports, so we can know what data feed into a credit score; credit reference agencies, however, have refused to disclose how those data are processed – the logic behind their decision-making, and in what way and to what extent the various data are weighted. Based on this decision, consumers are still not entitled to obtain that information directly, but a first step has been made by mandating disclosure to the relevant authority or court, which then decides whether or not to disclose it to the consumer, balancing the rights and interests of the two parties. This and other judgments of the CJEU (see C-634/21 SCHUFA Holding) may gradually bring transparency into this traditionally very opaque area.

As credit reference agencies nowadays use artificial intelligence for automated decision-making, the judgment is also relevant to advancing the transparency of AI systems.

Finally, given that the judgment tackles the operation of credit reference agencies, which are frequently used by creditors to assess the affordability of loan applications, it is relevant for responsible lending rules in Directive 2023/2225 on consumer credit (CCD2), which in Article 18 refers to creditworthiness assessment based on automated processing of personal data. 

Tuesday, 2 April 2024

How the CJEU's ruling in C-604/22 may transform online advertising: a closer look at the IAB Europe case

In March, the CJEU issued a ruling (Case C-604/22 IAB Europe) that has sparked a lot of discussion. The ruling addresses certain practices related to online advertising in Europe, particularly the collection of personal data for the purpose of behavioural advertising.

Facts of the case

The Interactive Advertising Bureau Europe (IAB Europe) is a non-profit association that represents digital advertising and marketing businesses at the European level. IAB Europe's members include companies that generate significant revenue by selling advertising space on websites or applications. Several years ago the association developed the Transparency & Consent Framework (TCF) to promote General Data Protection Regulation (GDPR) compliance when using the OpenRTB protocol (a popular system used for "real-time bidding", which means it quickly and automatically auctions off user information to buy and sell ad space on the internet). The TCF consists of guidelines, technical specifications, instructions, protocols, and contractual obligations. The framework is designed to ensure that when users access a website or application containing advertising space, technology businesses representing thousands of advertisers can instantly bid for that space using algorithms to display targeted advertising tailored to the individual's profile.

The TCF was presented as a solution to bring the auction system into compliance with the GDPR (paras 21–22). Before targeted advertisements may be displayed, the user's prior consent must be obtained. When a user visits a website or application, a Consent Management Platform (CMP) appears in a pop-up window. The CMP enables users to give their consent to the collection and processing of their personal data for pre-defined purposes, such as marketing or advertising, or to object to various types of data processing or sharing based on legitimate interests claimed by providers, as per Article 6(1f) of the GDPR. The personal data relate to the user's location, age, search history, and recent purchase history (para. 24). In other words, the TCF facilitates the capture of user preferences through the CMP. These preferences are encoded and stored in a "TC String" (a combination of letters and characters), which is then shared with organisations participating in the OpenRTB system, indicating what the user has consented or objected to. The CMP also places a cookie on the user's device; combined with the TC String, the user's IP address can identify the author of the preferences. The TCF thus plays a crucial role in the architecture of the OpenRTB system, as it is the vehicle for expressing users' preferences regarding potential vendors and various processing purposes, including the offering of tailor-made advertisements (paras 25–26).
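
To make these mechanics more tangible, here is a deliberately simplified Python sketch of how consent choices can be packed into a compact, shareable string. This is our own illustration, not the actual TC String format: the real bit layout (purposes, vendor lists, metadata) is defined in IAB Europe's technical specifications.

```python
# Deliberately simplified illustration - NOT the real TCF TC String format,
# whose bit layout is defined in IAB Europe's technical specifications.
import base64

PURPOSES = ["store_data_on_device", "basic_ads", "profile_building",
            "personalised_ads", "ad_measurement"]

def encode_preferences(consents: dict) -> str:
    """Pack one consent bit per purpose, then base64url-encode the bytes."""
    bits = "".join("1" if consents.get(p, False) else "0" for p in PURPOSES)
    raw = int(bits, 2).to_bytes((len(bits) + 7) // 8, "big")
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def decode_preferences(tc_string: str) -> dict:
    """Reverse the encoding: anyone with the spec can read the choices back."""
    raw = base64.urlsafe_b64decode(tc_string + "=" * (-len(tc_string) % 4))
    bits = bin(int.from_bytes(raw, "big"))[2:].zfill(len(PURPOSES))
    return {p: b == "1" for p, b in zip(PURPOSES, bits)}

tc = encode_preferences({"store_data_on_device": True, "basic_ads": True})
print(tc)                      # "GA" - opaque letters and characters
print(decode_preferences(tc))  # ...but fully readable once the layout is known
```

The takeaway mirrors the Court's point: the string itself looks like meaningless characters, yet anyone holding the specification can decode the user's choices, and, combined with an identifier such as an IP address, it can be linked to a person.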

Since 2019, the TCF model has faced numerous complaints to the Belgian Data Protection Authority (DPA) regarding its GDPR compliance. IAB Europe was criticized for providing users with information through the CMP interface that was too generic and vague, preventing users from fully understanding the nature and scope of data processing and thereby maintaining control over their personal data. Furthermore, IAB Europe was accused of failing to fulfil certain obligations of a data controller, including ensuring the lawfulness of processing, accountability, security, and adherence to the data protection by design and by default rules (more details about the proceedings can be found on the DPA's website). Consequently, the DPA concluded that IAB Europe did not meet its GDPR obligations and imposed an administrative fine of €250,000. Additionally, it mandated corrective actions to align the TCF with GDPR standards.

IAB Europe disagreed with the decision and challenged it before the Belgian court. According to IAB Europe, it should not be considered a data controller for recording the consent signal, objection, and preferences of individual users through a TC string. Thus the association should not be obliged to follow data controllers' obligations under GDPR. IAB Europe also disagreed with the DPA's finding that the TC string is personal data within the meaning of Article 4(1) of the GDPR. Specifically, IAB Europe argued that only the other participants in the TCF could combine the TC String with an IP address to convert it into personal data, that the TC String is not specific to a user and that IAB Europe cannot access the data processed in that context by its members (para. 28).

CJ's ruling

The Court confirmed the key aspects of the DPA’s decision, emphasizing, among other things, that:

1. the TC String holds information that pertains to an identifiable user and, thus, qualifies as personal data under Article 4(1) of the GDPR. Even if it doesn't contain any direct factors that allow the data subject to be identified, it does contain the preferences of a specific user relating to their consent to data processing. This information is considered to be related to a natural person (para. 43). If the information in a TC String is linked to an identifier, such as the IP address of the device, it could be possible to create a profile of that user and identify a particular person (para. 44). The fact that IAB Europe cannot combine the TC String with the IP address of a user's device and doesn't have direct access to the data processed by its members is irrelevant. As the Court stated, IAB Europe can require its members to provide it with the necessary information to identify the users whose data is being processed in a TC String (para. 48). This means that IAB Europe has reasonable means to identify a particular natural person from a TC String (para. 49).

2. IAB Europe, together with its members, is considered a 'joint controller' when it determines the purposes and ways of data processing. Why? According to the Court, the TCF framework aims to ensure that the processing of personal data by certain operators that participate in the online auctioning of advertising space complies with the GDPR. Consequently, it aims to promote and allow the sale and purchase of advertising space on the Internet by such operators. It means that IAB Europe has control over the personal data processing operations for its own purposes and, jointly with its members, determines the purposes of such operations (para. 62-64). Moreover, the TCF contains technical specifications relating to the processing of the TC String, such as how CMPs need to collect users' preferences, how such preferences must be processed to generate a TC String, etc. (para. 66). If any of IAB's members do not comply with the TCF rules, IAB Europe may adopt a non-compliance and suspension decision, which could result in the exclusion of that member from the TCF (para. 65). Therefore, the Court concluded that IAB Europe also determines the means of data processing operations jointly with its members (para. 68), so it meets the criteria of a data controller under Article 4(7) of the GDPR. However, this should not automatically make IAB Europe responsible for the subsequent processing of personal data carried out by operators and third parties based on information about the users' preferences recorded in a TC String (para. 74-76).

What could be the consequences of the ruling? 

The Court confirmed that IAB Europe, given the role and significant influence it has over the processing of data by its members for the purposes of creating user profiles and targeting them with personalized advertising, should be held responsible for how this process is organized. And it is organized in a way that is hardly transparent to users. While it is up to the national court to ultimately examine the compatibility of the Belgian DPA's decision with EU law, it can be expected that the court will affirm the main conclusions of the Belgian authority's decision.

It appears unlikely that the CJ's ruling will lead to the elimination of the intrusive pop-ups on many websites, which often rely on dark patterns and manipulative techniques to coerce consent for data processing for marketing purposes. Nevertheless, the advertising industry should place a greater emphasis on enhancing transparency and providing users with more control over their personal data. This could include the development of more user-friendly and informative consent mechanisms, making it easier for users to understand what they are consenting to and how to exercise their rights over their data. The ruling is also expected to impose further restrictions on behavioural advertising practices, particularly those dependent on real-time bidding and the widespread sharing of personal data without explicit, informed consent from users. 

Saturday, 28 October 2023

EDPS Opinion on AI Act proposal

The proposal for the Artificial Intelligence Act has caused a lot of heated discussion as it reaches its final stage. Recently, the European Data Protection Supervisor (EDPS) issued an opinion on the current version of the AI Act proposal*, pointing out several legal uncertainties from a data protection perspective. This is the second EDPS opinion about the forthcoming AI Act, following one issued jointly with the European Data Protection Board shortly after the proposal was revealed.

The EDPS takes a tough stance as regards some of the solutions envisaged in the proposal. For instance, the authority once again emphasized that classifying several uses of AI as "high risk" is not enough in cases where such uses pose unacceptable risks to fundamental rights (see para. 7 of the opinion). These include, among others:

  • any use of AI to carry out any type of "social scoring";
  • any use of AI for automated recognition of human features in publicly accessible spaces, such as of faces, gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioural signals;
  • the use of AI to infer emotions of a natural person except for certain well-specified use-cases, namely for health or research purposes;
  • any use of AI systems categorising individuals from biometrics into clusters according to ethnicity, gender, political or sexual orientation, or other grounds for discrimination prohibited under Article 21 of the EU Charter of Fundamental Rights.

According to the EDPS, such uses should be prohibited as they are intrusive and affect human dignity.

The EDPS also notes that the AI Act proposal exempts operators of high-risk AI systems already on the market or in use before the AI Act's applicability, except in cases when these systems are subject to significant changes in their design or purpose or in case of "substantial modifications" (para. 12 of the opinion, see also Article 83(2) of the AI Act proposal). However, the EDPS finds this solution unclear, leading to legal uncertainty and some high-risk AI systems never falling within the scope of the AI Act. The EDPS recommends removing this exemption and applying the AI Act to existing high-risk AI systems on the date of its applicability.

What is more, the EDPS suggests that the notion of AI "providers" should be further clarified, and probably (explicitly?) include AI operators who retrain pre-trained AI systems. Although training is a fundamental part of AI development, the current proposal does not clearly state whether activities such as retraining or continuous training should be considered as part of AI system 'development'. As a result, it is uncertain whether operators taking part in such activities could be assigned the status of "providers" of AI systems (para. 15-19 of the opinion). 

Finally, the authority shared specific recommendations on how to clarify the proposal's provisions on EDPS roles and tasks as a notified body, market surveillance authority and competent authority for the supervision of the development, provision or use of AI systems by EU institutions, bodies, offices and agencies (para. 29 et seq.).

* Updated information on the legislative process can be found here.

Wednesday, 19 April 2023

EDPB updated guidelines on right of access to personal data

A few days ago, the European Data Protection Board (EDPB) published updated guidelines (second version) on the rights of data subjects, specifically the right of access to personal data. Any person whose personal data is processed is entitled to the right of access under Art. 15 of the GDPR. The right of access is considered one of the key rights under the GDPR, as it allows you to maintain control over what personal data is being processed, by whom, on what legal basis, to whom it has been made available, etc. Although the guidelines are primarily addressed to data controllers, they contain valuable tips for data subjects, providing insight into the actual scope of our rights. It is worth becoming familiar with them: as consumers, we leave digital footprints almost everywhere, so it is good to know what rights we have.

To back this up, here are some noteworthy points from the guidelines:

1. If you ask for access to your data, the controller should give you access to all your personal data that are processed, unless you expressly limit your request (e.g. to identification data or data concerning a contract concluded on a particular day). The controller is not entitled to narrow the scope of your request arbitrarily, but may ask you to specify the request if it processes a large quantity of data.

2. Before granting access to personal data, the controller should confirm your identity in order to ensure the security of processing and minimise the risk of unauthorised disclosure of personal data. In this regard the EDPB emphasized that "as a rule, the controller cannot request more personal data than is necessary to enable this authentication, and that the use of such information should be strictly limited to fulfilling the data subjects’ request" (p. 25). The GDPR does not specify how to identify the data subject, so it is up to the controller to decide which authentication method is the most appropriate. However, the method must be proportionate to the circumstances of the processing, including the type of personal data being processed (e.g. special categories of data), the context within which the request is being made, and the potential damage that could result from improper disclosure of the data. It happens that controllers fail to meet this requirement and choose methods that are convenient for them, but disproportionate. The EDPB states: "In practice, authentication procedures often exist and controllers do not need to introduce additional safeguards to prevent unauthorised access to services. In order to enable individuals to access the data contained in their accounts (such as an e-mail account, an account on social networks or online shops), controllers are most likely to request the logging through the login and password of the user, which in such cases should be sufficient to authenticate a data subject. [...] Consequently, it is disproportionate to require a copy of an identity document in the event where the data subject making a request is already authenticated by the controller. [...] Taking into account the fact, that many organisations (e.g. hotels, banks, car rentals) request copies of their clients’ ID card, it should generally not be considered an appropriate way of authentication" (p. 27).

3. Information requested as part of the data access right should be provided to the data subject without undue delay and in any event within one month of receipt of the request. This deadline can be extended by a maximum of two months, taking into account the complexity and the number of requests the controller receives. In such a situation the data subject must be informed of the reasons for the delay. But the rule is that the controller should act "without undue delay", meaning the information should be given as soon as possible – "if it is possible to provide the requested information in a shorter amount of time than one month, the controller should do so" (p. 50).

4. Sometimes the controller may limit or refuse access to personal data. According to Art. 15(4) GDPR, the right to obtain a copy of data shall not adversely affect the rights and freedoms of others. Another restriction results from Art. 12(5) GDPR, which enables controllers to refuse requests that are manifestly unfounded or excessive, in particular because of their repetitive character. These concepts must be interpreted narrowly. The data access right may be exercised more than once, but, as indicated in recital 63 of the GDPR, "at reasonable intervals". It is not possible to determine in advance how often requests for access to data may permissibly be made, because this depends on the circumstances of the processing. The EDPB remarks that "the more often changes occur in the database of the controller, the more often data subjects may be permitted to request access to their personal data without it being excessive". For example, "in the case of social networks, a change in the data set will be expected at shorter intervals than in the case of land registers or central company registers" (p. 56).

These are just a few examples worth keeping in mind. For more, I recommend checking out the guidelines. 

Saturday, 28 January 2023

It is your right to know the actual identity of recipients to whom your personal data have been or will be disclosed (C-154/21 Österreichische Post)

The General Data Protection Regulation (GDPR) provides individuals (data subjects) with a number of rights. These are listed in Chapter III of the GDPR and include, inter alia, the right to be informed of the processing of personal data (Articles 13 and 14 of the GDPR), right of access (Article 15 of the GDPR), right to rectification (Article 16 of the GDPR), right to erasure (Article 17 of the GDPR) etc. In mid-January 2023, the Court of Justice in Case C-154/21 Österreichische Post answered a question concerning one of those rights, namely the right of access.

As stated in Article 15(1) of the GDPR, "the data subject shall have the right to obtain from the controller confirmation as to whether or not personal data concerning him or her are being processed, and, where that is the case, access to the personal data and the following information: […]

(c) the recipients or categories of recipient to whom the personal data have been or will be disclosed, in particular recipients in third countries or international organisations; […]".

The dispute arose because the data subject had requested from the controller the actual identity of the recipients to whom his personal data had been disclosed. The controller did not reveal the identity of the recipients, but informed the data subject only of the "categories of recipients", indicating that these were "customers, including advertisers trading via mail order and stationary outlets, IT companies, mailing list providers and associations such as charitable organisations, non-governmental organisations (NGOs) or political parties" (para. 20).

Indeed, doubts arise when applying Article 15(1) of the GDPR in practice. The main question is whether it is necessary to inform the data subject about the particular recipients of the data, or whether it is enough to indicate the general categories of those recipients. Similar doubts arise in the context of Articles 13(1e) and 14(1e) of the GDPR, which oblige the controller, as part of its information obligations performed at the time of data collection, to inform about "the recipients or categories of recipients of the personal data, if any".

In the Court's view, Article 15(1) of the GDPR gives the right to be informed about the specific recipients of personal data and thus to know their actual identity. The Court cites several arguments in this regard:

(1) The data subjects should be guaranteed the right to know and be informed about the processing of their personal data, in particular about the recipients to whom the data are made available. This is emphasised in Recital 63 of the GDPR, which, nota bene, does not refer to the right to information about "categories of recipients of data", but generally to the right to information about "recipients of personal data" (para. 33).

(2) The controller must process personal data in accordance with the principle of transparency, which from the data subject's perspective means that information on how his or her personal data is processed should be easily accessible and comprehensible (para. 35).

(3) "Article 15 of the GDPR lays down a genuine right of access for the data subject, with the result that the data subject must have the option of obtaining either information about the specific recipients to whom the data have been or will be disclosed, where possible, or information about the categories of recipient" (para. 36).

(4) The right of access is often exercised to verify the accuracy of the data or the lawfulness of the processing. In this sense, the right of access frequently determines further actions of the data subject, i.e. the exercise of other rights under the GDPR, e.g. the right to erasure or the right to object to processing. Therefore, the complete and diligent exercise of the right of access is essential to guarantee the effectiveness of the data subject's rights (para. 38).

However, the Court reminded that the right to the protection of personal data is not an absolute right and is subject to limitations. The controller, despite an express request by the data subject, does not have to provide information on the identity of the recipients of the data if "in specific circumstances it is not possible to provide information on specific recipients" (e.g. when it is not possible to identify those recipients - para. 51), and furthermore when the data subject's request is unjustified or excessive in nature [as stated in Article 12(5b) GDPR].

In practice, this means that each request will have to be carefully analysed. It is certainly easier for controllers to provide general information on the categories of recipients than precise information on the recipients' identity. For controllers with large datasets, who share data with many entities and receive many data access requests, a detailed examination of data flows may be cumbersome. What the judgment lacks, in my view, is a clarification of what the 'specific circumstances' that would justify a refusal to disclose the identity of data recipients could consist of.

It appears from the CJ's reasoning that such a specific circumstance may be the lack of knowledge of the future recipients (para. 48). The question is whether such a circumstance could also be the difficulty of listing all data recipients due to their large number. In practice, this is a common problem for controllers. Yet such an interpretation does not seem acceptable. It can be said that the Court has spread a protective umbrella over data subjects, obliging controllers to be more accurate and transparent in their processing and to provide reliable and complete information to data subjects. This is a good signal for data subjects, especially consumers of various online services, as the judgment provides clear grounds for demanding detailed information about the processing of personal data.

Saturday, 31 December 2022

December wrap-up of data protection cases (Google, Österreichische Datenschutzbehörde and Pankki S)

The end of the month (and the end of the year as well) is a good moment for summaries. This time we are taking a closer look at events in the area of data protection law. December was a month with a couple of interesting events, so here is a brief recap. 

Dereferencing allegedly inaccurate content (C-460/20 Google)

The case concerned two executives of a group of investment companies (a board member and a proxy) who asked Google to remove search results linking their names to certain articles criticising the group's investment model. They exercised the so-called right to be forgotten, guaranteed under Article 17(1) of the GDPR, claiming that the information presented contained false claims and defamatory opinions. They also wanted Google to remove their thumbnail images from the search results. Google rejected these requests, arguing that it does not know whether the information contained in the articles is true or not.

In cases involving the erasure of data from a search engine operator's search results, two rights usually collide: the public's right of access to information (especially about persons holding public positions) and the individual's right to the protection of his or her personal data, including the right to erasure, protection of his or her good name, image, etc. The same tension arose in this case, as we noted when reporting on the AG's opinion issued in the proceedings. In its ruling of 8 December 2022, the Court held that the person requesting the deletion of data is obliged to show that the information is manifestly inaccurate. "However, in order to avoid imposing on that person an excessive burden which is liable to undermine the practical effect of the right to de-referencing, that person has to provide only evidence that, in the light of the circumstances of the particular case, can reasonably be required of him or her to try to find in order to establish that manifest inaccuracy" (para. 68). This means that such a person cannot be required to present a judicial decision made against the publisher of the website in question, even in the form of a decision given in interim proceedings, since that would impose an unreasonable burden on him or her. At the same time, "the operator of the search engine concerned cannot be required to investigate the facts and, to that end, to organise an adversarial debate with the content provider seeking to obtain missing information concerning the accuracy of the referenced content" (para. 71). Therefore, if the person who made a request for de-referencing submits relevant and sufficient evidence showing the manifest inaccuracy of the information found in the referenced content, the operator of the search engine is required to accede to that request. But the operator should not grant a request if the inaccurate character of the information is not obvious in the light of the evidence presented (paras 72–73).

As regards the thumbnails the Court concluded that "a separate weighing-up of competing rights and interests is required depending on whether the case concerns, on the one hand, articles containing photographs which are published on an internet page and which, when placed into their original context, illustrate the information provided in those articles and the opinions expressed in them, or, on the other hand, photographs displayed in the list of results in the form of thumbnails by the operator of a search engine outside the context in which they were published on the original internet page" (para. 101). The Court also stated that the informative value of those images should be taken into account independently of the context of their publication on the website from which they originate, nevertheless taking into account all the content that directly accompanies the display of those images in the search results and that can explain the informative value of those images (para. 108).

The concept of a "copy of personal data" under Article 15(3) of the GDPR: AG Pitruzzella's opinion in the Österreichische Datenschutzbehörde case (C-487/21)

The dispute arose over the interpretation of Article 15(3) of the GDPR, which provides that a data subject, as part of the right of access to one's personal data, may obtain a copy of that data. The complainant requested an exact copy of the data processed by the controller, including full copies of documents containing his personal data. However, the controller provided only some of the requested information as an aggregate that reproduced the stored personal data of the data subject in a table broken down by name, date of birth, street, postal code, and place, and in a statement summarising corporate functions and powers of representation. As part of the proceedings, the national court decided to refer several questions concerning the interpretation of Article 15(3) of the GDPR to the Court. 

On 15 December 2022, the AG delivered an opinion stating that the concept of “copy” referred to in Article 15(3) of the GDPR must be understood as "a faithful reproduction in intelligible form of the personal data requested by the data subject, in material and permanent form, that enables the data subject effectively to exercise his or her right of access to his or her personal data in full knowledge of all his or her personal data that undergo processing – including any further data that might be generated as a result of the processing, if those also undergo processing – in order to be able to verify their accuracy and to enable him or her to satisfy himself or herself as to the fairness and lawfulness of the processing so as to be able, where appropriate, to exercise further rights conferred on him or her by the GDPR". The AG underlined that this provision does not, in principle, entitle the data subject to obtain a full copy of documents containing the personal data, but, at the same time, does not exclude the need to provide that person with extracts from documents, whole documents or extracts from databases if that is necessary to ensure that the personal data undergoing processing are fully intelligible.

The right to know the identity of the persons who had access to one's personal data: AG Campos Sánchez-Bordona's opinion in the Pankki S case (C-579/21)

The third case also concerned the right of access to personal data, but from a different perspective. The data subject wanted to know who exactly (among the employees of the financial institution) had accessed his personal data at the time when he was both a customer of that institution and an employee thereof. The controller refused to provide the names of the employees, arguing that Article 15 of the GDPR does not apply to the log data of the institution's data processing system and that the information requested does not relate to the personal data of the data subject, but to the personal data of the employees.

The AG endorsed the controller's view and stated that Article 15(1) of the GDPR "does not give the data subject the right to know, from among the information available to the controller (where applicable, through records or log data), the identity of the employee or employees who, under the authority and on the instructions of the controller, have consulted his or her personal data". In justifying his opinion, he pointed out that "the identity of individual employees who have handled the processing of customer data is particularly sensitive information from a security point of view, at least in certain economic sectors" (para. 76). Disclosure of employees' data could expose them to attempts by customers of the banking institution to exert pressure and influence. Nevertheless, the AG noted that if a data subject has reasonable doubts about the integrity or impartiality of an individual who has participated on behalf of the controller in the processing of his or her data, this could justify the customer's interest in knowing the identity of the employee in order to exercise the right to take action against that employee (para. 78; nb. in the relevant case the data subject made his request, in particular, in order to clarify the reasons for his dismissal).




Thursday, 24 November 2022

Can we seek compensation for a GDPR breach if it caused great upset or inner discomfort? The AG Opinion in C-300/21, Österreichische Post

According to Article 82(1) of the GDPR, any person who has suffered material or non-material damage as a result of an infringement of the Regulation has the right to receive compensation from the controller or processor for the damage. It turns out that exercising this right in practice raises some questions, especially where the damage caused by the infringement consists of “great upset” or a “loss of confidence”. Recently, Advocate General Campos Sánchez-Bordona commented on this issue (see: case C-300/21 Österreichische Post).

Facts of the case
The case concerns the processing of personal data by an Austrian postal company (Österreichische Post AG). Since 2017, the company had been collecting personal data on the Austrian public's affinities for political parties, inferring information on political preferences from various socio-demographic characteristics. Such processing did not please “UI” (as the data subject is referred to in the AG's opinion). More specifically, he objected to the way the company classified him as a sympathiser of one of Austria's political parties. UI therefore entered into a dispute with the company, pointing out, for instance, that he had not consented to the processing of his personal data. As we read in the opinion, UI “was upset by the storage of his party affinity data and angered and offended by the affinity specifically attributed to him by Österreichische Post” (para. 10). What is more, he claimed that such a “political affinity attributed to him is insulting and shameful, as well as extremely damaging to his reputation” (para. 11). He therefore demanded compensation of EUR 1 000 in respect of non-material damage (inner discomfort).

Both the court of first instance and the appellate court rejected his claim. On further appeal, however, the Oberster Gerichtshof (Supreme Court, Austria) raised several doubts and referred the following questions to the Court of Justice for a preliminary ruling:

"1. Does the award of compensation under Article 82 of the GDPR also require, in addition to infringement of provisions of the GDPR, that an applicant must have suffered harm, or is the infringement of provisions of the GDPR in itself sufficient for the award of compensation?

2. Does the assessment of the compensation depend on further EU-law requirements in addition to the principles of effectiveness and equivalence?

3. Is it compatible with EU law to take the view that the award of compensation for non-material damage presupposes the existence of a consequence of the infringement of at least some weight that goes beyond the upset caused by that infringement?"


Opinion of the AG

The AG presented an interesting analysis of Article 82 of the GDPR, drawing on different methods of interpretation (literal, historical, contextual and purposive). Several of his statements deserve attention:


1. Awarding compensation under Article 82 of the GDPR for a breach of the Regulation in the absence of any damage would be inconsistent with the fundamental purpose of civil liability, which is to compensate for the damage suffered by the data subject. If no damage could be identified, the compensation awarded would not fulfil that function but would instead resemble a punishment or sanction imposed on the infringer (paras 29-30). It is true that punitive damages may exist in both EU and national law, but the GDPR contains no reference to them (paras 39, 44, 49-50).


2. The AG's position is that a mere breach of the GDPR does not give rise to a presumption of automatic harm to the data subject (paras 56-59). As can be inferred from the Opinion, this presumption was advanced by the parties to the proceedings, who argued that a breach leads to a loss of control over the data and thus causes harm to the data subject. However, the AG considers that not every loss of control over data necessarily leads to harm (para. 62) and, furthermore, that an obligation to give data subjects as much control over their data as possible cannot necessarily be derived from the GDPR provisions (para. 74). He states: “where a data subject does not consent to processing and processing is carried out without another legitimate legal basis, that is not a ground for the data subject to receive financial compensation on account of the loss of control over his or her data, as though that loss of control itself amounted to damage that is eligible for compensation” (para. 77).


3. The compensation for non-material damage regulated by Article 82 of the GDPR does not cover the mere upset that a person may feel as a result of a breach of Regulation 2016/679. It is for the national courts to determine when, given its characteristics, a subjective feeling of displeasure can be regarded as non-material damage in a given case (conclusion - para. 117).

Given the facts of the case, the AG's answers to the preliminary questions do not seem surprising. Nonetheless, some of his views are debatable, such as the claim that “it is not straightforward to conclude from the GDPR that its objective is to grant data subjects control over their personal data as a right in itself” (para. 74).

In my view, one of the primary objectives of the GDPR is precisely to give individuals control over their data, or even to 'restore' that control. This conclusion can also be drawn from the provisions of other EU data flow regulations, such as the Data Governance Act* or the Data Act proposal**. The opinion was, of course, given on the basis of the GDPR provisions, but arguably those provisions should not be interpreted in isolation from the broader regulatory context. That said, we eagerly await the Court's final verdict.


* For instance, recital 5 of the DGA states that it “is necessary to increase trust in data sharing by establishing appropriate mechanisms for control by data subjects”. A similar idea is expressed in recital 30 in the context of data intermediation services: “data intermediation services providers seek to enhance the agency of data subjects, and in particular individuals’ control over data relating to them”. The DGA may not state directly that its purpose is to “grant control over data”, but this can still be deduced from both the content and the particular objectives of the mechanisms it establishes.
** See, for example, recital 78 of the proposal: “To foster further trust in the data, it is important that safeguards in relation to Union citizens, the public sector and businesses are implemented to the extent possible to ensure control over their data”. Again, this is not stated expressly, but without ensuring control over data, the other objectives of the regulation cannot be achieved. From this perspective, granting control over data may be seen as one of its purposes.