
Tuesday, 2 September 2025

Key GDPR Fines in Mid-2025: Luka (Replika), TikTok, and ING Bank Śląski

This post discusses three recent decisions of Data Protection Authorities imposing fines for GDPR infringements on Luka Inc., TikTok, and ING Bank Śląski. While most of our analyses focus on judgments of the Court of Justice of the European Union, in this case we turn to decisions of national authorities. Such decisions tend to attract significant attention, whether because of the seriousness of the violations or because of the high amounts of the penalties, which makes them a frequent subject of debate. The three cases selected meet these criteria and, moreover, were issued within the past few months. They are also directly relevant to consumers, as they highlight risks that many of us face in everyday life when using apps or online services where personal data may be mishandled.
 
 
Luka Inc. 

On 10 April 2025, the Italian Data Protection Authority (Garante per la protezione dei dati personali) issued a decision against Luka Inc., the U.S. company behind the Replika chatbot. Replika is marketed as an AI “companion” designed to boost users’ mood and wellbeing, and can be set up as a friend, mentor, therapist or even a romantic partner. But according to the Garante, the way Luka handled users’ personal data fell far short of what the GDPR requires.

The investigation showed that Replika’s privacy policy did not clearly identify the legal ground for the many different ways in which users’ data were processed – for example, data used for running the chatbot versus data used for developing the large language model behind it. Instead of specifying purposes and corresponding legal bases, the policy only gave vague, generic statements like: “We care about the protection and confidentiality of your data. We therefore only process your data to the extent that: It is necessary to provide the Replika services you are requesting, you have given your consent to the processing, or we are otherwise authorized to do so under the data protection laws” (btw – doesn’t that sound familiar from many privacy policies?). This lack of granularity made it impossible for users to understand how their data were really being used, in breach of Articles 5(1)(a) and 6 GDPR.

What’s more, the privacy notice was only available in English, even though the service was offered in Italy. It also failed to explain key points required under GDPR: what kinds of data were collected, how long they were stored, whether data were transferred outside the EU, and for what purpose. Some statements were even misleading, for instance, suggesting that personal data might be transferred to the U.S., while the company later claimed no such transfers took place. Such gaps and contradictions meant that users could not make informed choices about their data.

The most troubling finding, however, was that Luka had failed to implement effective safeguards for children. Although the service was formally intended for adults, it lacked genuine age-verification mechanisms. Registration required only a name, email address, and gender, which allowed minors to create accounts. Even when users declared they were under 18, no technical barrier prevented them from accessing the platform. In practice, this meant that children could be exposed to age-inappropriate content, including sexually explicit material. Moreover, even after updates to the privacy policy, technical testing showed that under-18 users could still bypass the age restriction simply by editing their profile.
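For readers wondering what an "effective safeguard" could even look like at the technical level, here is a deliberately minimal sketch in Python of the kind of server-side age gate whose absence the Garante's testers exposed. All names and the storage model are our own assumptions for illustration, not Luka's actual code; the point is only that the birth date declared at sign-up is recorded once, and access checks rely on that immutable record rather than on the user-editable profile.

from datetime import date

MIN_AGE = 18

def age(born: date, today: date) -> int:
    # Full years elapsed, accounting for whether the birthday has passed this year.
    return today.year - born.year - ((today.month, today.day) < (born.month, born.day))

def register(verified_birth_dates: dict, user_id: str, birth_date: date) -> None:
    # Record the birth date declared at sign-up exactly once;
    # later profile edits must never overwrite it.
    verified_birth_dates.setdefault(user_id, birth_date)

def may_access(verified_birth_dates: dict, user_id: str) -> bool:
    born = verified_birth_dates.get(user_id)
    # Deny access if no recorded birth date exists or the user is a minor,
    # instead of trusting whatever the editable profile currently says.
    return born is not None and age(born, date.today()) >= MIN_AGE

store = {}
register(store, "u1", date(2010, 5, 1))
register(store, "u1", date(1990, 1, 1))   # "edit" attempt: ignored by setdefault
assert may_access(store, "u1") is False

In the setup the Garante examined, the check was effectively performed against the editable declaration, which is why changing the profile was enough to lift the restriction.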
 
For these violations, the Garante imposed an administrative fine of EUR 5,000,000, representing half of the maximum amount available under Article 83(5) GDPR.
 
 
TikTok Technology Limited
 
Another significant decision was issued by the Irish Data Protection Commission (DPC) in May 2025 against TikTok Technology Limited. Although the full text of the decision has not yet been published, the official press release provides insight into the reasons for the sanction.
 
The inquiry examined both the lawfulness of TikTok’s transfers of European users’ personal data to China and the adequacy of the company’s transparency regarding those transfers. The DPC concluded that TikTok had infringed the GDPR in two key respects.
 
First, the Commission found that TikTok’s transfers of user data to China violated Article 46(1) GDPR. The company failed to verify, guarantee, and demonstrate that personal data of European users – remotely accessed by staff in China – was afforded a level of protection essentially equivalent to that required within the EU. TikTok’s own assessments of Chinese law highlighted serious divergences from EU standards, particularly risks under the Anti-Terrorism Law, the Counter-Espionage Law, and the National Intelligence Law. Nevertheless, the company did not adequately address these risks or ensure that its contractual safeguards were effective.
 
Second, the DPC held that TikTok had not complied with the information duties set out in Article 13(1)(f) GDPR. Earlier versions of its privacy policy (in force between July 2020 and December 2022) failed to identify the countries involved in data transfers and did not explain the nature of the processing – for instance, that personnel in China could remotely access data stored in Singapore and the United States. This lack of clarity prevented users from understanding who could access their data and under what conditions.
 
The decision imposed not only administrative fines but also corrective measures. TikTok was given six months to bring its practices into compliance, failing which data transfers to China would have to be suspended altogether. The total fine amounted to EUR 530,000,000, comprising EUR 485,000,000 for the unlawful transfers and EUR 45,000,000 for the lack of transparency.
 
 
ING Bank Śląski
 
The third decision discussed here was delivered on 23 July 2025 by the Polish Data Protection Authority (UODO) against ING Bank Śląski S.A., which was fined PLN 18,416,400 (around EUR 4,000,000). The case revolved around the bank’s widespread practice of copying and scanning the ID cards of both existing and potential clients, even in situations where such a step was not required by law. The bank had introduced this practice after an amendment of the Polish anti-money laundering provisions, interpreting them as justifying the systematic copying of IDs.
 
The investigation revealed that between April 2019 and September 2020 the bank systematically scanned ID documents not only during customer onboarding, but also in contexts where no anti-money laundering (AML) obligations applied – for example, when a customer filed a complaint about an ATM. In practice, the bank’s internal procedures made the delivery of services conditional on handing over a scanned ID, leaving consumers with no real choice.
 
As emphasized in the decision, both AML law and the GDPR require banks to conduct a risk-based assessment and determine, case by case, whether copying an ID is genuinely necessary. ING failed to perform such assessments. Instead, it adopted blanket rules requiring ID copies in numerous situations, regardless of whether AML obligations applied. As a result, the bank processed extensive amounts of sensitive identifying information without a valid legal basis under Article 6 GDPR. Although no specific harm was demonstrated, the decision underscores that ID cards contain a wide range of personal data – including full name, date of birth, parents’ names, unique national ID number (PESEL), photograph, and document series and number. Taken together, these data significantly increase the risk of identity theft or fraudulent loans. Given that ING had millions of individual and corporate clients during the period in question, the potential consequences of such unnecessary data collection were substantial.
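The contrast between a blanket rule and the case-by-case assessment the decision demands can be reduced to a toy decision rule. The following is a hypothetical sketch in Python; the context labels and the two-condition test are our own invention for illustration, not the wording of the AML provisions or of the UODO decision.

# Hypothetical contexts in which AML identification duties can arise at all.
AML_CONTEXTS = {"account_opening", "high_value_transfer"}

def may_copy_id(context: str, copy_necessary_per_risk_assessment: bool) -> bool:
    # A blanket rule effectively ignores both conditions below and demands an
    # ID scan even for, e.g., context == "atm_complaint".
    # A compliant rule asks, for each individual case, whether an AML duty
    # applies AND whether the risk assessment makes a copy genuinely necessary.
    return context in AML_CONTEXTS and copy_necessary_per_risk_assessment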

Saturday, 28 January 2023

It is your right to know the actual identity of recipients to whom your personal data have been or will be disclosed (C-154/21 Österreichische Post)

The General Data Protection Regulation (GDPR) provides individuals (data subjects) with a number of rights. These are listed in Chapter III of the GDPR and include, inter alia, the right to be informed of the processing of personal data (Articles 13 and 14 of the GDPR), the right of access (Article 15 of the GDPR), the right to rectification (Article 16 of the GDPR) and the right to erasure (Article 17 of the GDPR). In mid-January 2023, the Court of Justice in Case C-154/21 Österreichische Post answered a question concerning one of those rights, namely the right of access.

As stated in Article 15(1) of the GDPR, “the data subject shall have the right to obtain from the controller confirmation as to whether or not personal data concerning him or her are being processed, and, where that is the case, access to the personal data and the following information: […]

(c)  the recipients or categories of recipient to whom the personal data have been or will be disclosed, in particular recipients in third countries or international organisations; […].

The dispute arose because the data subject asked the controller to disclose the actual identity of the recipients to whom his personal data had been or would be disclosed. The controller, however, did not reveal the identity of the recipients, but informed the data subject only of the “categories of recipients”, indicating that they were “customers, including advertisers trading via mail order and stationary outlets, IT companies, mailing list providers and associations such as charitable organisations, non-governmental organisations (NGOs) or political parties” (para. 20). 

Indeed, doubts arise when applying Article 15(1) of the GDPR in practice. The main question is whether it is necessary to inform data subjects about the particular recipients of the data, or whether it is enough to indicate the general categories of those recipients. Similar doubts arise in the context of Articles 13(1)(e) and 14(1)(e) of the GDPR, which oblige the controller, as part of its information obligations performed at the time of data collection, to inform about “the recipients or categories of recipients of the personal data, if any”.

In the Court's view, Article 15(1) of the GDPR gives the right to be informed about the specific recipients of personal data and thus to know their actual identity. The Court cites several arguments in this regard:

(1) The data subjects should be guaranteed the right to know and be informed about the processing of their personal data, in particular about the recipients to whom the data are made available. This is emphasised in Recital 63 of the GDPR, which, nota bene, does not refer to the right to information about "categories of recipients of data", but generally to the right to information about "recipients of personal data" (para. 33).

(2) The controller must process personal data in accordance with the principle of transparency, which from the data subject's perspective means that information on how his or her personal data is processed should be easily accessible and comprehensible (para. 35).

(3) “Article 15 of the GDPR lays down a genuine right of access for the data subject, with the result that the data subject must have the option of obtaining either information about the specific recipients to whom the data have been or will be disclosed, where possible, or information about the categories of recipient” (para. 36).

(4) The right of access is often exercised to verify the accuracy of the data or the lawfulness of the processing. In this sense, the right of access frequently determines further actions of the data subject, i.e. the exercise of other rights under the GDPR, e.g. the right to erasure or the right to object to processing. Therefore, the complete and diligent exercise of the right of access is essential to guarantee the effectiveness of the data subject's rights (para. 38).

However, the Court reminded that the right to the protection of personal data is not an absolute right and is subject to limitations. The controller, despite an express request by the data subject, does not have to provide information on the identity of the recipients of the data if "in specific circumstances it is not possible to provide information on specific recipients" (e.g. when it is not possible to identify those recipients - para. 51), and furthermore when the data subject's request is manifestly unfounded or excessive [as stated in Article 12(5)(b) GDPR].

In practice, this means that each request will have to be carefully analysed. It is certainly easier for controllers to provide general information on the categories of recipients than precise information on the identity of the recipients. For controllers with large datasets, who share data with many entities and receive many data access requests, a detailed examination of data flows may be cumbersome. What the judgment lacks, in my view, is a clarification of what the 'specific circumstances' that would justify a refusal to disclose the identity of data recipients could consist of. 

It appears from the CJ's reasoning that such a specific circumstance may be the lack of knowledge of the future recipients (para. 48). The question is whether such a circumstance could also be the difficulty of listing all data recipients due to their large number. In practice, this is a common problem for controllers. Yet, such an interpretation does not seem to be acceptable. It can be said that the Court has spread a protective umbrella over data subjects, obliging controllers to be more accurate and transparent in their processing and to provide reliable and complete information to data subjects. This is a good signal for data subjects, especially consumers of various online services, as the judgment provides clear grounds for demanding detailed information about the processing of personal data. 
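From a compliance-engineering perspective, the judgment essentially requires controllers to keep track of disclosures at the level of individual recipients. A minimal sketch of such a disclosure log follows (in Python; the schema and names are our own assumptions, offered only to show that answering an Article 15(1)(c) request with specific recipients is, technically, a matter of record-keeping):

from collections import defaultdict
from datetime import datetime, timezone

class DisclosureLog:
    """Append-only record of which recipient received which data subject's data."""

    def __init__(self):
        self._entries = defaultdict(list)

    def record(self, subject_id: str, recipient: str, category: str) -> None:
        # Log every disclosure with the specific recipient, not just its category.
        self._entries[subject_id].append({
            "recipient": recipient,
            "category": category,
            "when": datetime.now(timezone.utc).isoformat(),
        })

    def answer_access_request(self, subject_id: str) -> list:
        # After C-154/21, specific recipients are the default answer;
        # categories alone remain a fallback for recipients that cannot
        # yet be identified (e.g. future disclosures).
        return self._entries.get(subject_id, [])

A controller that only ever stored category labels would have nothing more precise to return, which is exactly the practice the Court rejected.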

Saturday, 26 February 2022

The long-awaited Data Act proposal finally (officially) published

For several years, the European Union has been developing a new digital policy framework that aims to comprehensively regulate the data space in the EU. One of the EU's policy objectives is to make the data generated by humans and machines, especially in the context of IoT devices, more accessible, thereby unlocking the enormous but still under-used potential of this data. According to the European Strategy for Data released in 2020, this objective is to be achieved, inter alia, through the adoption of a so-called Data Act - a regulation on harmonised rules on fair access to and use of data. A leaked version of this act had been circulating on the Internet since the beginning of February, but it was not until 23 February 2022 that it was officially published by the European Commission. 

Although the Data Act is mostly focused on business-to-business and business-to-government data sharing, it is also important for consumer protection in the digital environment. As we can read in the proposal’s explanatory memorandum:


a high level of consumer protection is reinforced with the new right to access user generated data in situations previously not covered by Union law. The right to use and dispose of lawfully acquired possessions is reinforced with a right to access data generated from the use of an Internet of Things object. This way, the owner may benefit from a better user experience and a wider range of, for example, repair and maintenance services. In the context of consumer protection, the rights of children as vulnerable consumers deserve specific attention and the rules of the Data Act will contribute to clarity about data access and use situations. [p. 13]


and


The proposal facilitates the portability of the user’s data to third parties and thereby allows for a competitive offer of aftermarket services, as well as broader data-based innovation and the development of products or services unrelated to those initially purchased or subscribed to by the user. [p.13]


These assumptions are reflected mainly in Chapter II of the proposal, which introduces, among others: 

  • the obligation to make data generated by the use of products or related services accessible (Article 3);
  • the right of users to access and use data generated by the use of products or related services (Article 4);
  • the right to share data with third parties (Article 5);
  • the obligations of third parties receiving data at the request of the user (Article 6).

The proposal will now be further debated under the legislative path before the European Parliament and the Council. It will certainly be discussed among the scientific community and consumer organisations. The EC proposals, although at first glance reasonable and necessary, require an in-depth analysis, in particular from the perspective of already existing data protection and consumer law. Let us just recall that under the GDPR, data subjects have the right of access to their data (Article 15 GDPR) and the right to data portability (Article 20 GDPR). The effective exercise of these rights is sometimes problematic in practice, for example due to the controller's lack of actual control over data flows or the lack of interoperability between devices/services, making it impossible to transfer data from one provider to another.

It is also important to remember that the devices we use every day as consumers may generate not only data containing personal information (and therefore qualifying as personal data), but also non-personal data of a technical nature, containing valuable information about how the devices function or are used by consumers. At the same time, due to the large volumes of data produced by IoT devices and services, the differences between personal and non-personal data are increasingly difficult to grasp. For these reasons, the Data Act is a piece of EU legislation that has been long awaited. We can therefore expect the debate surrounding this act to be very lively and interesting.