Tuesday, 2 September 2025

Key GDPR Fines in Mid-2025: Luka (Replika), TikTok, and ING Bank Śląski

This post discusses three recent decisions of national Data Protection Authorities imposing fines for GDPR infringements on Luka Inc., TikTok, and ING Bank Śląski. While our analyses usually focus on judgments of the Court of Justice of the European Union, in this case we turn to decisions of national authorities. Such decisions tend to attract significant attention – whether because of the seriousness of the violations or the size of the penalties – which makes them a frequent subject of debate. The three cases selected meet these criteria and, moreover, were all issued within the past few months. They are also directly relevant to consumers, as they highlight risks that many of us face in everyday life when using apps or online services where personal data may be mishandled.
 
 
Luka Inc. 

On 10 April 2025, the Italian Data Protection Authority (Garante per la protezione dei dati personali) issued a decision against Luka Inc., the U.S. company behind the Replika chatbot. Replika is marketed as an AI “companion” designed to boost users’ mood and wellbeing, and can be set up as a friend, mentor, therapist or even a romantic partner. But according to the Garante, the way Luka handled users’ personal data fell far short of what the GDPR requires.

The investigation showed that Replika’s privacy policy did not clearly identify the legal ground for the many different ways in which users’ data were processed – for example, data used for running the chatbot versus data used for developing the large language model behind it. Instead of specifying purposes and corresponding legal bases, the policy only gave vague, generic statements like: “We care about the protection and confidentiality of your data. We therefore only process your data to the extent that: It is necessary to provide the Replika services you are requesting, you have given your consent to the processing, or we are otherwise authorized to do so under the data protection laws” (btw – doesn’t that sound familiar from many privacy policies?). This lack of granularity made it impossible for users to understand how their data were really being used, in breach of Articles 5(1)(a) and 6 GDPR.

What’s more, the privacy notice was only available in English, even though the service was offered in Italy. It also failed to explain key points required under the GDPR: what kinds of data were collected, how long they were stored, whether data were transferred outside the EU, and for what purpose. Some statements were even misleading: for instance, the policy suggested that personal data might be transferred to the U.S., while the company later claimed that no such transfers took place. Such gaps and contradictions meant that users could not make informed choices about their data.

The most troubling finding, however, was that Luka had failed to implement effective safeguards for children. Although the service was formally intended for adults, it lacked genuine age-verification mechanisms. Registration required only a name, email address, and gender, which allowed minors to create accounts. Even when users declared they were under 18, no technical barrier prevented them from accessing the platform. In practice, this meant that children could be exposed to age-inappropriate content, including sexually explicit material. Moreover, even after updates to the privacy policy, technical testing showed that under-18 users could still bypass the age restriction simply by editing their profile.
 
For these violations, the Garante imposed an administrative fine of EUR 5,000,000, representing half of the maximum amount available under Article 83(5) GDPR.
 
 
TikTok Technology Limited
 
Another significant decision was issued by the Irish Data Protection Commission (DPC) in May 2025 against TikTok Technology Limited. Although the full text of the decision has not yet been published, the official press release provides insight into the reasons for the sanction.
 
The inquiry examined both the lawfulness of TikTok’s transfers of European users’ personal data to China and the adequacy of the company’s transparency regarding those transfers. The DPC concluded that TikTok had infringed the GDPR in two key respects.
 
First, the Commission found that TikTok’s transfers of user data to China violated Article 46(1) GDPR. The company failed to verify, guarantee, and demonstrate that personal data of European users – remotely accessed by staff in China – was afforded a level of protection essentially equivalent to that required within the EU. TikTok’s own assessments of Chinese law highlighted serious divergences from EU standards, particularly risks under the Anti-Terrorism Law, the Counter-Espionage Law, and the National Intelligence Law. Nevertheless, the company did not adequately address these risks or ensure that its contractual safeguards were effective.
 
Second, the DPC held that TikTok had not complied with the information duties set out in Article 13(1)(f) GDPR. Earlier versions of its privacy policy (in force between July 2020 and December 2022) failed to identify the countries involved in data transfers and did not explain the nature of the processing – for instance, that personnel in China could remotely access data stored in Singapore and the United States. This lack of clarity prevented users from understanding who could access their data and under what conditions.
 
The decision imposed not only administrative fines but also corrective measures. TikTok was given six months to bring its practices into compliance, failing which data transfers to China would have to be suspended altogether. The total fine amounted to EUR 530,000,000, comprising EUR 485,000,000 for the unlawful transfers and EUR 45,000,000 for the lack of transparency.
 
 
ING Bank Śląski
 
The third decision discussed here was delivered on 23 July 2025 by the Polish Data Protection Authority (UODO) against ING Bank Śląski S.A., which was fined PLN 18,416,400 (around EUR 4,000,000). The case revolved around the bank’s widespread practice of copying and scanning ID cards of both existing and potential clients, even in situations where such a step was not required by law. The bank introduced this practice after an amendment to Polish anti-money laundering provisions, interpreting them as justifying the systematic copying of IDs.
 
The investigation revealed that between April 2019 and September 2020 the bank systematically scanned ID documents not only during customer onboarding, but also in contexts where no anti-money laundering (AML) obligations applied – for example, when a customer filed a complaint about an ATM. In practice, the bank’s internal procedures made the delivery of services conditional on handing over a scanned ID, leaving customers with no real choice.
 
As emphasized in the decision, both AML law and the GDPR require banks to conduct a risk-based assessment and determine, case by case, whether copying an ID is genuinely necessary. ING failed to perform such assessments. Instead, it adopted blanket rules requiring ID copies in numerous situations, regardless of whether AML obligations applied. As a result, the bank processed extensive amounts of sensitive identifying information without a valid legal basis under Article 6 GDPR. Although no specific harm was demonstrated, the decision underscores that ID cards contain a wide range of personal data – including full name, date of birth, parents’ names, unique national ID number (PESEL), photograph, and document series. Taken together, these data significantly increase the risk of identity theft or fraudulent loans. Given that ING had millions of individual and corporate clients during the period in question, the potential consequences of such unnecessary data collection were substantial.