Showing posts with label data protection. Show all posts

Sunday, 6 April 2025

Do you really need my title? The CJEU says no – a win for consumer privacy in case C‑394/23

(Source: Freepik)

Have you ever been asked about your title while purchasing something online? It’s a common practice, but most of us (consumers) don’t realise that it raises concerns from a data protection perspective, especially when the seller requires us to provide this information and does not allow us to skip the form field and place the order without disclosing our gender. This practice was challenged by the French association Mousse in proceedings against the French Data Protection Authority (Commission nationale de l’informatique et des libertés, CNIL) and the French railway operator SNCF Connect, eventually resulting in a preliminary ruling by the Court of Justice of the EU (Case C‑394/23).

The facts

SNCF Connect sells rail travel documents such as train tickets and discount cards via its website and mobile applications. When purchasing these products, customers are required to indicate their title by selecting either Monsieur (Mr) or Madame (Ms). This requirement raised Mousse’s concerns about its compliance with the General Data Protection Regulation (GDPR).

The association filed a complaint with CNIL, arguing that the collection of titles lacked a valid legal basis under Article 6(1) GDPR, violated the data minimisation principle under Article 5(1)(c), and failed to meet the transparency and information obligations set out in Article 13 GDPR. The CNIL rejected the complaint, concluding that collecting titles was justified as necessary for the performance of a contract under Article 6(1)(b) and aligned with accepted norms of personalised communication (paras. 13–15). Mousse appealed the decision to the French Conseil d’État, which referred several preliminary questions to the CJ.

The ruling

The Court of Justice essentially said “no” to this kind of data processing. It did not declare that the processing of title-related personal data is categorically prohibited under the GDPR, but stressed that in the specific context of this case, it “does not appear to be either objectively indispensable or essential to enable the proper performance of the contract” concluded with the consumer (para. 39).

Here are the key takeaways from the judgment:

1. The Court focused its analysis on Articles 6(1)(b) and 6(1)(f) GDPR, which establish when data processing is lawful. Article 6(1)(b) allows processing when it is “necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract”, while Article 6(1)(f) permits it if it serves a legitimate interest of a controller or a third party, provided that interest is not overridden by the data subject’s fundamental rights and freedoms.

The Court made it clear that when relying on contractual necessity under Article 6(1)(b), the controller must show that the processing is “objectively indispensable for a purpose that is integral to the contractual obligation intended for the data subject” (para. 33). In other words, the controller must demonstrate that the processing “must be essential for the proper performance of the contract concluded between the controller and the data subject and, therefore, that there are no workable, less intrusive alternatives” (para. 34). Applying this to the case at hand, the Court rejected the CNIL’s and SNCF’s claim that collecting customers’ titles is necessary for personalised commercial communication, and that such communication is an essential part of the contract. According to the Court:

“Commercial communication may constitute a purpose forming an integral part of the contractual service concerned, since the provision of such a rail transport service involves, in principle, communicating with the customer in order, inter alia, to send him or her a travel document by electronic means, to inform him or her of any changes affecting the corresponding journey, and to allow exchanges with the after-sales service. That communication may require adherence to accepted practices and may include, in particular, forms of addressing a customer, in order to show that the undertaking concerned respects its customer and thereby to safeguard that undertaking’s brand image. However, it appears that such communication does not necessarily have to be personalised based on the gender identity of the customer concerned” (paras. 37–38).

In short, personalising content is not necessary if the same service can be provided in a standard, non-personalised way. The controller could instead use more privacy-friendly alternatives, such as generic and inclusive forms of address that do not rely on the consumer’s assumed gender identity (para. 40).

2. Furthermore, the systematic and generalised processing of all customers’ titles cannot be justified by the mere fact that some of them use night-train services. Even if collecting titles is necessary to adapt transport services on night trains, which have carriages reserved for persons of the same gender identity, and to assist passengers with disabilities, this does not, in the Court’s view, justify collecting the titles of all customers, including those who travel during the daytime or who do not have disabilities. Such a practice is disproportionate and contrary to the principle of data minimisation (para. 42).

3. As regards the ‘legitimate interests’ ground under Article 6(1)(f) GDPR, the Court found that personalised commercial communication can be achieved by using customers’ first and last names alone, since requiring their title or gender identity is not strictly necessary, particularly in light of the data minimisation principle (para. 55). Moreover, it’s important to note that Article 6(1)(f) GDPR does not allow “common practices or social conventions” to justify the necessity of processing personal data (para. 56).

4. Finally, the fact that data subjects may object to the processing under Article 21 GDPR is irrelevant in this context. According to the Court, this opt-out mechanism should not be taken into account when assessing whether the original data collection was lawful (para. 70). To put it simply, controllers cannot justify collecting unnecessary personal data by simply allowing individuals to object afterwards. While the right to object is an important safeguard, it does not give controllers a free pass to collect data first and handle objections later.

Our comment

The judgment has a direct impact on the practices of certain data controllers who, without a valid legal basis, collect excessive data concerning consumers’ titles and gender identity, where such information is not necessary for the purposes of processing. The CJ ruling serves as a clear reminder that personal data must be processed in accordance with the principle of data minimisation, meaning that only data strictly necessary to achieve the intended purpose should be collected and used.

Importantly, the Court did not declare that the collection of such data is absolutely prohibited under the GDPR. Rather, it emphasised that lawfulness depends on the specific context. For example – although not stated explicitly, this can be inferred from the reasoning – a controller may process such data on the basis of the data subject’s consent. In that case, a form used by the consumer to conclude a contract could include an optional field allowing the individual to indicate a preferred form of address. Crucially, this field would not be mandatory: if the consumer wished to provide that information, they could do so; if not, they could simply skip it without consequence. 


PS. In the context of this judgment, it is also worth drawing attention to another recent CJEU decision (case C‑247/23), which likewise concerned the processing of gender identity data. In that case, the Court reaffirmed that one of the fundamental duties of a data controller is to ensure the accuracy of the personal data processed. If a data subject exercises their right to rectification, the controller should not impose disproportionate administrative burdens that unjustifiably hinder the exercise of that right. The case involved a request to update the gender information in a public register maintained by a Hungarian authority. The individual, registered as female, sought to have the record amended to reflect his male gender, submitting medical documentation to support the request. The authority, however, demanded proof of surgical gender reassignment – a requirement the CJ found excessive and incompatible with the essence of fundamental rights, including the rights to personal integrity and respect for private life.

Tuesday, 2 April 2024

How the CJEU's ruling in C-604/22 may transform online advertising: a closer look at the IAB Europe case

In March, the CJEU issued a ruling (Case C-604/22 IAB Europe) that has sparked a lot of discussion. The ruling addresses certain practices related to online advertising in Europe, particularly the collection of personal data for the purpose of behavioural advertising.

Facts of the case

The Interactive Advertising Bureau Europe (IAB Europe) is a non-profit association that represents digital advertising and marketing businesses at the European level. IAB Europe's members include companies that generate significant revenue by selling advertising space on websites or applications. Several years ago, the association developed the Transparency & Consent Framework (TCF) to promote General Data Protection Regulation (GDPR) compliance when using the OpenRTB protocol (a popular protocol for "real-time bidding", i.e. the automated, near-instantaneous auctioning of advertising space on websites and apps, informed by data about the user). The TCF consists of guidelines, technical specifications, instructions, protocols, and contractual obligations. The framework is designed to ensure that when users access a website or application containing advertising space, technology businesses representing thousands of advertisers can instantly bid for that space using algorithms to display targeted advertising tailored to the individual's profile.
Image by "storyset" (Freepik)

The TCF was presented as a solution to bring the auction system into compliance with the GDPR (paras. 21-22). Before targeted advertisements can be displayed, however, the user's prior consent must be obtained. When a user visits a website or application, a Consent Management Platform (CMP) appears in a pop-up window. The CMP enables users to consent to the collection and processing of their personal data for pre-defined purposes, such as marketing or advertising, or to object to various types of data processing or data sharing based on legitimate interests claimed by providers, as per Article 6(1)(f) of the GDPR. The personal data relates to the user's location, age, search history, and recent purchase history (para. 24). In other words, the TCF facilitates the capture of user preferences through the CMP. These preferences are encoded and stored in a "TC String" (a combination of letters and characters), which is then shared with organizations participating in the OpenRTB system, indicating what the user has consented or objected to. The CMP also places a cookie on the user's device; when that cookie is combined with the TC String and the user's IP address, the author of the recorded preferences can be identified. The TC String thus plays a crucial role in the architecture of the OpenRTB system, as it expresses users' preferences regarding potential vendors and various processing purposes, including the offering of tailor-made advertisements (paras. 25-26).
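To make the mechanism more concrete, the idea behind the TC String can be sketched in a few lines of code. The snippet below is a deliberately simplified, hypothetical illustration (the purpose names and the one-byte layout are inventions for this post; the real TCF specification defines a far richer binary format), but it shows the basic principle: per-purpose choices are packed into bits and base64url-encoded into a compact string that fits in a cookie.

```python
import base64

# Hypothetical, simplified sketch of the idea behind a "TC string": per-purpose
# consent choices are packed into a bit field and base64url-encoded so they can
# be stored in a cookie and shared with vendors. The real TCF specification
# defines many more fields and a strict binary layout; this is NOT that format.

PURPOSES = ["store_access_info", "basic_ads", "personalised_ads", "measurement"]

def encode_preferences(consents):
    """Pack per-purpose consent flags (a dict of bools) into a compact string."""
    bits = 0
    for i, purpose in enumerate(PURPOSES):
        if consents.get(purpose, False):
            bits |= 1 << i
    return base64.urlsafe_b64encode(bytes([bits])).decode().rstrip("=")

def decode_preferences(tc_string):
    """Reverse the encoding: recover which purposes the user accepted."""
    padded = tc_string + "=" * (-len(tc_string) % 4)
    bits = base64.urlsafe_b64decode(padded)[0]
    return {p: bool(bits & (1 << i)) for i, p in enumerate(PURPOSES)}

# A user consents to everything except personalised ads:
tc = encode_preferences({"store_access_info": True, "basic_ads": True,
                         "personalised_ads": False, "measurement": True})
print(tc)  # a short opaque string: "Cw"
```

In the actual framework, the CMP generates and stores such a string, and downstream OpenRTB participants decode it to check what the user has (or has not) agreed to, which is why the Court's characterisation of the string as personal data matters so much in practice.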

Since 2019, the TCF model has been the subject of numerous complaints to the Belgian Data Protection Authority (DPA) regarding its GDPR compliance. IAB Europe was criticized for providing users with information through the CMP interface that was too generic and vague, preventing users from fully understanding the nature and scope of the data processing and thereby from maintaining control over their personal data. Furthermore, IAB Europe was accused of failing to fulfil certain obligations of a data controller, including ensuring the lawfulness of processing, accountability, security, and adherence to data protection by design and by default rules (more details about the proceedings can be found on the DPA's website). Consequently, the DPA concluded that IAB Europe did not meet its GDPR obligations and imposed an administrative fine of €250,000. Additionally, it mandated corrective actions to align the TCF with GDPR standards.

IAB Europe disagreed with the decision and challenged it before the Belgian courts. According to IAB Europe, it should not be considered a data controller for recording the consent signal, objections and preferences of individual users in a TC String, and should therefore not be bound by the data controllers' obligations under the GDPR. IAB Europe also disagreed with the DPA's finding that the TC String is personal data within the meaning of Article 4(1) of the GDPR. Specifically, IAB Europe argued that only the other participants in the TCF could combine the TC String with an IP address to convert it into personal data, that the TC String is not specific to a user, and that IAB Europe cannot access the data processed in that context by its members (para. 28).

CJ's ruling


The Court has confirmed the key aspects of the DPA’s decision, emphasizing, among other things, that:


1. the TC String holds information that pertains to an identifiable user and, thus, qualifies as personal data under Article 4(1) of the GDPR. Even if it doesn't contain any direct factors that allow the data subject to be identified, it does contain the preferences of a specific user relating to their consent to data processing. This information is considered to be related to a natural person (para. 43). If the information in a TC String is linked to an identifier, such as the IP address of the device, it could be possible to create a profile of that user and identify a particular person (para. 44). The fact that IAB Europe cannot combine the TC String with the IP address of a user's device and doesn't have direct access to the data processed by its members is irrelevant. As the Court stated, IAB Europe can require its members to provide it with the necessary information to identify the users whose data is being processed in a TC String (para. 48). This means that IAB Europe has reasonable means to identify a particular natural person from a TC String (para. 49).

2. IAB Europe, together with its members, is considered a 'joint controller' when it determines the purposes and means of data processing. Why? According to the Court, the TCF framework aims to ensure that the processing of personal data by certain operators that participate in the online auctioning of advertising space complies with the GDPR. Consequently, it aims to promote and allow the sale and purchase of advertising space on the Internet by such operators. This means that IAB Europe has control over the personal data processing operations for its own purposes and, jointly with its members, determines the purposes of such operations (paras. 62-64). Moreover, the TCF contains technical specifications relating to the processing of the TC String, such as how CMPs need to collect users' preferences, how such preferences must be processed to generate a TC String, etc. (para. 66). If any of IAB's members do not comply with the TCF rules, IAB Europe may adopt a non-compliance and suspension decision, which could result in the exclusion of that member from the TCF (para. 65). Therefore, the Court concluded that IAB Europe also determines the means of the data processing operations jointly with its members (para. 68), so it meets the criteria of a data controller under Article 4(7) of the GDPR. However, this should not automatically make IAB Europe responsible for the subsequent processing of personal data carried out by operators and third parties based on information about the users' preferences recorded in a TC String (paras. 74-76).

What could be the consequences of the ruling? 

The Court confirmed that IAB Europe, given the role and significant influence it has over the processing of data by its members for the purposes of creating user profiles and targeting them with personalized advertising, should be held responsible for how this process is organized. And it is organized in a way that is hardly transparent to users. While it is ultimately for the national court to examine the lawfulness of the Belgian DPA's decision, it can be expected that the court will affirm the main conclusions of the Belgian authority's decision.

It appears unlikely that the CJ's ruling will lead to the elimination of the intrusive pop-ups on many websites, which often rely on dark patterns and manipulative techniques to coerce consent for data processing for marketing purposes. Nevertheless, the advertising industry should place a greater emphasis on enhancing transparency and providing users with more control over their personal data. This could include the development of more user-friendly and informative consent mechanisms, making it easier for users to understand what they are consenting to and how to exercise their rights over their data. The ruling is also expected to impose further restrictions on behavioural advertising practices, particularly those dependent on real-time bidding and the widespread sharing of personal data without explicit, informed consent from users. 

Friday, 3 December 2021

Consumer organisations may bring proceedings to defend collective interests of consumers based on the GDPR, if national law so states: AG opinion in C-319/20, Facebook Ireland

Yesterday Advocate General Richard de la Tour delivered his opinion in case C-319/20, Facebook Ireland, considering whether consumer organisations have standing to bring judicial proceedings against infringements of the General Data Protection Regulation 2016/679 independently of actual infringements of data subjects' rights. Arguably, the importance of the case goes beyond its procedural dimension (not least due to Directive 2020/1828 on representative actions, which elaborates on the enforcement framework, including for the GDPR). In the expert report published by BEUC earlier this year, the case was highlighted as a possible "game changer" for the relationship between consumer and data protection law (see also: New study on consumer protection in the digital age...). The direction of the AG's opinion is likely to be welcomed in the consumer protection community.

Facts of the case

The case involves a number of data processing practices identified by the German federation of consumer organisations (vzbv) on the Facebook platform back in 2012. Most notably, the federation argued that information about the processing of personal data in connection with third-party apps available in Facebook's App Centre failed to meet the applicable requirements. German courts generally agreed that the vzbv had a point on the merits. However, following the entry into force of the GDPR, a doubt arose as to whether the federation retained standing to bring such proceedings independently of the infringement of specific data subjects' rights.

Opinion of the AG 

Standing of consumer organisations

The problem sounds familiar? That's because it is. A similar question was considered by the CJEU in 2019, in the context of the previously applicable Data Protection Directive (Fashion ID case). Back then the Court rejected the argument that consumer organisations should not be entitled to bring claims under data protection rules. According to the AG, this has not changed with the entry into force of the GDPR; quite the contrary, the regulation explicitly provides for collective redress, and nothing in Article 80(2) of the act implies that an organisation can only bring proceedings if particular persons affected by the processing have been identified.

The conclusion reached by the AG in respect of the GDPR appears to be well-founded. The reasoning relies on literal, systematic and teleological interpretation alike. The AG refers, first, to the definition of parties entitled to bring representative actions under Article 80 of the GDPR. According to the AG, that definition extends to "all entities which pursue an objective in the public interest that is connected with the protection of personal data", which also covers consumer protection associations (para. 61). As regards the further conditions for bringing representative actions, the AG found it sufficient for an entity to demonstrate "an infringement of the provisions of Regulation 2016/679 designed to protect the subjective rights of data subjects", without the need to verify whether the rights of one or more specific persons have been infringed (para. 63). In addition, arguments concerning the effectiveness of the GDPR, its consistency with Directive 2020/1828, and a high level of protection of personal data are cited.

Two broader points

Aside from the above, two further aspects of the opinion merit attention. Firstly, the AG considers the "particular characteristics" of the GDPR as a regulation and connects it to discussions on full harmonisation. The AG notes that while the GDPR "seems, at first sight, to tend towards full harmonisation ... the truth is more complex" (paras. 50-51). According to the AG:

"[T]he legal basis of Regulation 2016/679, namely Article 16 TFEU, precludes the view that in adopting that regulation the European Union would have pre-empted all the ramifications which the protection of personal data may have in other areas relating, in particular, to employment law, competition law or even consumer law, by depriving Member States of the possibility of adopting specific rules in those areas, more or less independently, depending on whether the area in question is governed by EU law. In that sense, although the protection of personal data is by nature cross-sectoral, the harmonisation implemented by Regulation 2016/679 is limited to the aspects specifically covered by that regulation in that area. Apart from those aspects, the Member States remain free to legislate, provided that they do not undermine the content and the objectives of that regulation." (para. 51)

One can wonder to what extent the above finding depends on the legal basis chosen. This is particularly important in the context of ongoing legislative developments at EU level which equally take the form of regulations, but are based on Article 114 TFEU. A prominent case in point is the proposed Artificial Intelligence Act and the more recent proposal on political targeting. Arguably, doubts about the Member States' discretion can best be resolved by way of careful drafting that makes adequate use of 'opening clauses'.

Secondly, the opinion touches upon the broader relationship between consumer and data protection law. The AG admits that "unlike ... in the United States of America, in EU law the regulations relating to unfair commercial practices and those relating to the protection of personal data have developed separately" and "are thus the subject of different regulatory frameworks" (para. 79). The opinion further observes that unlike EU consumer law, the GDPR "is not based on a consumerist concept of the protection of natural persons in relation to the processing of personal data, but on the concept that that protection is ... a fundamental right" (para 82). A number of important connections between consumer and data protection law are nonetheless recognized, as illustrated below:

"[T]here is some interaction between the two areas, so that actions falling within the framework of the regulations relating to the protection of personal data may, at the same time and indirectly, contribute to putting an end to an unfair commercial practice. The opposite is also true." (para. 80)  

"[I]n the age of the digital economy, data subjects often have the capacity of consumers. It is for that reason that the rules designed to protect consumers are often relied on to ensure that consumers are protected against a processing of their personal data that is contrary to the provisions of Regulation 2016/679." (para. 83)

and finally

"[T]here may be an overlap between the representative action provided for in Article 80(2) of Regulation 2016/679 and that provided for in Directive 2020/1828 in order to obtain injunctive relief when ‘data subjects’, within the meaning of that regulation, also have the capacity of ‘consumer’, within the meaning of Article 3(1) of that directive. I see there the sign of complementarity and convergence of the law relating to the protection of personal data with other areas of law, such as consumer law and competition law. With the adoption of that directive, the EU legislature went even further and expressly linked the protection of the collective interests of consumers with compliance with Regulation 2016/679. The effective application of the rules contained in that regulation cannot but be strengthened as a result." (para. 83)

Concluding thought

Overall, the AG not only speaks out in favour of consumer organisations' standing in cases involving data protection violations, but also supports a close relationship between consumer and data protection law. Arguably, both fields can also be aligned conceptually and, indeed, complement each other in the attainment of a high level of consumer and data protection. A judgment endorsing the AG's point of view would thus be very welcome.

Monday, 22 November 2021

European Data Protection authorities speak up on targeted advertisement

 Dear readers, 

this is a teaching-intensive autumn across European universities - with all the excitement, uncertainty and overall strains of being mostly back in class after over a year of mostly living room lecturing. 

This, however, should not mean that we let crucial developments go unnoticed: last week, in fact, the European Data Protection Board (EDPB) issued its most resolute opinion yet on the matter of privacy and behavioural tracking. Cookies, in other words - a staple not only of many people's secret kitchen stashes but also of equally elusive locations on our devices. 

The occasion for the opinion is the Commission's proposal for a Digital Services Act, which according to the Board should be brought more clearly into line with data protection rules. Couched among guidelines and standpoints on a number of highly salient issues - from counterterrorism to face recognition AI - the EDPB has called for 

1) considering a phase-out of targeted ads based on "pervasive tracking";

2) in any event, prohibiting targeted ads addressed at children.   

The opinion does not expand on the reasons for this standpoint, but mainly refers to previous positions contained in comments on the DSA by the European Data Protection Supervisor (EDPS) and the European Parliament. In fact, criticism of the current rules' focus on informed consent has been around for at least the better part of the past decade (see Frederik Borgesius for a classic account). 

The European Data Protection Board is composed of representatives of the national data protection authorities. As a collective body mirroring positions in the Member States, its positions can perhaps carry more sway than the occasionally more principled stances of the EDPS. 

Saturday, 18 September 2021

A look back at consumer data protection - what has happened this year?

Today we rush to provide a brief summary of interesting developments at the interface of consumer law and data protection that have taken place since the beginning of this year. A lot has happened and so far we have only reported on selected cases. In this post we take a closer look at significant decisions and opinions issued this year, which will likely see continuation in the near future. 

We begin with a fairly recent decision adopted by the Irish Data Protection Commission (DPC) in the WhatsApp case. The decision captured headlines earlier this month - and not without reason. After all, it is not every day that Facebook (as the parent company of WhatsApp) is hit with a fine of €225 million for violations of the General Data Protection Regulation. The proceedings are interesting for both material and procedural reasons. Firstly, the ultimate decision reached by the DPC provides an extensive analysis of the GDPR’s transparency provisions, as applied to the case at hand. For a start, the decision points to the “over-supply of very high level, generalised information” and the possibility of creating an “information fatigue” - a well-known issue in consumer law and policy. Aside from problems of volume and presentation (e.g. multiple cross-references), violations of substantive transparency are also identified (e.g. insufficiently specific categories of data recipients). Secondly, the proceedings reveal the relevance of the GDPR’s consistency mechanism in cases involving cross-border data processing (for related controversies and case law see our comment here). In the WhatsApp case, the Data Protection Commission was acting as the lead authority - and its original draft decision had not been quite as sweeping as the one eventually adopted. It was the interventions of the other supervisory authorities and the binding decision adopted by the European Data Protection Board in late July that led the DPC to take a harsher stance - in relation to both the identified infringements and the calculated fines. Facebook has reportedly challenged the decision before a competent court, so stay tuned!


An even greater fine was imposed in July on Amazon: the Luxembourg data protection authority fined the company €746 million (a record so far) for illegal ad targeting and ordered it to review its practices. As La Quadrature du Net (the French privacy rights group that filed the complaint) explains, Amazon was targeting data subjects for advertising purposes without their freely given consent and was therefore processing consumer data without a legal basis. Unfortunately, the Luxembourg authority did not publish the content of the decision, citing professional secrecy. According to a published statement, the authority views the publication of a decision as an additional sanction. To be sure, the decision may ultimately see the light of day, but only after all legal remedies have been exhausted and provided that publication is not likely to cause disproportionate harm to the parties involved. 


Another interesting case - this time at the online/offline interface - has been handled by the Swedish supervisory authority, which reviewed the activities of a public transport operator in Stockholm. The case involved ticket inspectors equipped with video and audio recording cameras worn on their clothing. The use of this technology was intended to counter possible dangerous situations during ticket checks, to document incidents and to ensure that the right person is fined for travelling without a ticket. The problem was that the inspectors had to keep the cameras on throughout their shifts and could thus potentially record every traveller. In addition, recorded images and sound were overwritten only after one minute. The authority considered this period disproportionately long and held that the storage time should be reduced to a maximum of 15 seconds. While this case involved recording undertaken by a trader’s employees, one can well imagine similar problems arising in the context of increasingly sophisticated smart devices carried by consumers themselves. The recently unveiled Facebook glasses (Ray-Ban Stories) provide a prominent case in point - and the competent authorities have already voiced concerns.


Last but not least, as regards the decisions of national authorities, TikTok's sorrowful saga of alleged violations of children's privacy continues to unfold. Previously, it was the Italian data protection authority that ordered the platform to limit the processing of data of users whose age could not be established with certainty (we reported on it here). This time the Dutch authority investigated TikTok's privacy policy and concluded that the company had failed to inform its users, including children, in a clear and transparent way about how their data are processed via the app. More specifically, the language of the information proved decisive: it was provided only in English, not in Dutch. In consequence, the authority imposed a comparatively modest fine of €750,000. This, however, may not be the end of TikTok's legal problems, as the launch of two further inquiries by the Irish DPC demonstrates.


Finally, there are two additional pieces of news we consider noteworthy. The first, which has probably caught our readers' attention, is the Commission's adequacy decision on data transfers between the EU and the UK, issued on 28 June. The decision is a direct follow-up to Brexit and confirms that the UK guarantees a level of data protection essentially equivalent to that provided in the EU. However, we are curious to see how these data transfers will develop in the future, especially in the context of recent reports about planned changes to UK data protection law (we briefly touched upon this here). The second matter is a joint opinion of the European Data Protection Board and the European Data Protection Supervisor on the proposal for a regulation laying down harmonised rules on artificial intelligence. The opinion highlights several key points that need further elaboration (e.g. clarifications on the risk-based approach and the EDPS' role as a market surveillance authority) and calls for "a general ban on any use of AI for an automated recognition of human features in publicly accessible spaces".


** Agnieszka Jabłonowska contributed to this post.

Thursday, 26 August 2021

UK using their right to forget the right to privacy again?

The UK government has named John Edwards as the new Information Commissioner, heading the Information Commissioner's Office (ICO). His task: to move away from the EU data protection rules, which are, at least to an extent, perceived as 'pointless' (see the BBC's news item 'Data protection 'shake-up' takes aim at cookie pop-ups'). For example, pop-up notices informing visitors that a website uses cookies might in the future only be required where the website poses a 'high risk' to privacy. How 'high risk' will be determined remains to be seen. Generally, privacy is to be protected with 'a light touch', which likely means that the new UK rules will not be compatible with the GDPR. This, in turn, may inhibit trade with EU countries (if the UK is found to deviate too much from the GDPR to guarantee safe data transfers), which may not be worth the regulatory haven the UK government is dreaming about. But then again, the right to privacy was never fully embraced as a human right in the UK, and it seems Brexit could provide an excuse to strip it away again.

Sunday, 27 June 2021

One-stop-shop mechanism of the GDPR clarified by the Court of Justice (case C-645/19 Facebook Ireland)

Last week the Court of Justice delivered an important judgment in case C-645/19 Facebook Ireland. The case offered an opportunity to clarify procedural aspects of the General Data Protection Regulation 2016/679, in particular the topical problems related to the one-stop-shop mechanism provided for in cases of cross-border data processing. The GDPR assigns the role of "lead authority" in this context to the supervisory authority of the main establishment or of the single establishment of the relevant controller or processor. Since many digital companies undertaking large-scale data processing in the EU have their main establishment in Ireland, it is the Irish Data Protection Commission that acts as the lead authority in many such cases. Over the past years, however, the authority has come under strong criticism for failing to act effectively on complaints brought before it, and the particulars of the one-stop-shop mechanism have been a subject of debate (see e.g. our previous posts €50m fine imposed on Google..., Further updates on consumer protection..., BEUC files complaints against TikTok). In case C-645/19 Facebook Ireland, the Court addressed some of the issues raised, clarifying when authorities of other Member States are competent to exercise their powers.
 
Facts of the case
 
The background of the case was primarily procedural. The Belgian supervisory authority brought a case before a Belgian court against Facebook Ireland, Facebook Inc. and Facebook Belgium, with the aim of bringing an end to the collection of information on the internet browsing behaviour of Facebook users and non-users by means of cookies, social plug-ins, pixels, etc. The court of first instance considered itself competent to give a ruling and confirmed the alleged infringements. The trader appealed against the judgment, and the question was raised whether the Belgian supervisory authority had the required standing and interest to bring proceedings in the first place and, if so, in relation to which violations (e.g. those committed by Facebook Inc., Facebook Belgium and/or Facebook Ireland; before and/or after 25 May 2018, the date on which the GDPR and its one-stop-shop mechanism became applicable).

Judgment of the Court
  
Since the judgment is rather technical, the present post does not aspire to provide a comprehensive overview; interested readers are advised to consult the judgment directly. Instead, we will pick up on selected points, partly in a different order from the one adopted by the Court.
 
Firstly, the Court engaged with the argument put forward by the platform operator that the legal action concerning facts preceding 25 May 2018 was inadmissible, given that the previously applicable provisions of Belgian law were repealed following the entry into force of the GDPR. The Court addressed this problem from the perspective of EU law, finding that a supervisory authority which brought an action related to cross-border processing taking place before 25 May 2018 may continue to pursue that action on the basis of the previously applicable Directive 95/46 (para. 105). Put differently, the one-stop-shop mechanism established in the GDPR does not stand in the way of proceedings by different DPAs in relation to violations preceding the GDPR's date of application. The Court, however, did not engage with the interpretation of the previously applicable Directive 95/46, or with the question whether its provisions on supervisory authorities could be deemed to have direct effect. By contrast, such an effect was confirmed in relation to the relevant provisions of the GDPR (para. 113).

Arguably, the most interesting part of the judgment concerns the one-stop-shop mechanism itself (the first question). This is where the judgment gets particularly technical: the reasoning is intertwined with extensive references to GDPR provisions and appears to change direction frequently. Ultimately, para. 71 and the following deserve particular attention. Here the Court finds that the exercise by a non-lead authority of the power to bring actions before the courts of its own state cannot be ruled out in the following situations. Firstly, this is the case when the mutual assistance of the lead supervisory authority has been sought under Article 61 of the GDPR and the lead authority failed to provide the requested information. Secondly, under Article 64(2) of the GDPR, a supervisory authority may request that any matter of general application, or producing effects in more than one Member State, be examined by the European Data Protection Board with a view to obtaining an opinion, in particular where a competent supervisory authority does not comply with the obligations for mutual assistance imposed on it by Article 61 of the GDPR. Following the relevant procedure (that is, if the EDPB approves), the supervisory authority should be able to exercise the power conferred on it by Article 58(5) of the GDPR and take the measures necessary to ensure compliance with the GDPR.

The remaining part of the judgment concerned the potential additional prerequisites for the exercise of that power by a national authority other than the lead authority in the cases described above; specifically, whether the actions of such non-lead DPAs should be limited to controllers having a main establishment or another establishment on their territory. The Court looked at this problem from a twofold perspective and opted for a reading that does not significantly restrict the powers of such non-lead authorities (paras. 84 and 96). Put differently, it remains theoretically possible for a non-lead authority such as the Belgian one to initiate or engage in legal proceedings against a company like Facebook Inc.
 

Thursday, 1 April 2021

From Open Banking to Open Finance? UK developments

Following the possibility provided by PSD2 of opening up banking data to third-party access, the UK's Competition and Markets Authority instructed the nine largest banks to do so at the request of the consumer. This initiative, called 'Open Banking', enables consumers to share their data with third-party providers, which can use the data to develop innovative financial products and to provide 'traditional' services in a more convenient way (see our post here).

Three years on, and building on the experience gained from Open Banking, the UK's Financial Conduct Authority may be ready to take the next step towards 'Open Finance'. The FCA published a Call for Input on the shift to Open Finance in December 2019 and, following a large number of responses from industry representatives, published its Feedback Statement in March 2021.

Open Finance refers to the extension of Open Banking-like data sharing to a wider range of financial products, such as savings, investments, pensions and insurance. It would enable consumers and SMEs to access and share their data with third-party providers outside banking, which could then use that data to develop innovative products and services meeting consumers' current and future needs. The idea is to give consumers and SMEs greater ownership of, and control over, their data.
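The control model underlying both Open Banking and Open Finance is permission-based: a third-party provider may read only the data categories the consumer has explicitly granted. A minimal, purely illustrative sketch (the scope names are invented for this example and are not taken from any actual Open Banking standard):

```python
from typing import Set

# Hypothetical permissions a consumer might grant to a third-party provider.
# Open Banking covers payment-account data; Open Finance would extend the
# same model to savings, investments, pensions, insurance, and so on.
GRANTED_SCOPES: Set[str] = {
    "accounts:read",   # current-account balances and transactions
    "savings:read",    # savings products
    "pensions:read",   # pension data
}


def may_access(requested_scope: str, granted: Set[str] = GRANTED_SCOPES) -> bool:
    """A provider may read data only within the scopes the consumer has
    explicitly granted - the 'greater control' Open Finance promises."""
    return requested_scope in granted
```

A request for `"accounts:read"` would succeed here, while one for `"insurance:read"` would be refused until the consumer extends the grant; in practice such grants are time-limited and revocable, which is where the consumer-protection questions discussed below arise.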

Open Finance could potentially offer significant benefits to consumers, including increased competition, improved financial advice, and access to a wider and more innovative range of financial products and services. The FCA also sees this as an opportunity to enable creditworthy but previously excluded consumers to access financial services such as credit.

In addition to consumers, SMEs would also benefit from Open Finance through improved integration of payment, accounting and lending platforms for internal management, leading to greater cash-flow control. SMEs could better compare the products and services available to them from a range of providers and, through this avenue, would benefit from greater access to commercial lending.

Overall, both consumers and SMEs could gain access to a wider range of products and services, have greater control over their data, and potentially engage better with their finances.

However, the FCA is also concerned that Open Finance would create or increase risks and raise new questions, especially around data protection and digital identity. Further digitalisation might deepen the exclusion of some consumer groups, such as those who are less digitally savvy, and greater responsibility might result in the choice of inappropriate or even dangerous financial products. The initiative will therefore have to be accompanied by an appropriate legislative and regulatory framework, common standards, and the designation of an implementation entity, as was the case for Open Banking.

There is no doubt that Open Finance could be highly beneficial for consumers. However, with greater control comes greater responsibility, and the big question from a consumer protection perspective is: are consumers ready to take this on?


Wednesday, 27 January 2021

Further updates on consumer protection in the digital economy: TikTok, Google and Apple in the spotlight

Yesterday we reported on some encouraging news about online consumer protection coming from Norway. Today we would like to follow up on these reports and briefly review other developments relevant to consumer protection in the digital economy that caught our attention over the past months. All of them show how consumer law, competition law and data protection law are relevant to the protection of consumer interests vis-à-vis major online platforms.

Our readers may have heard about the recent decision of the Italian data protection authority, the Garante, ordering TikTok to immediately limit the processing of personal data of users whose age could not be established with certainty. The action was taken as a matter of urgency, following the death of a young girl in Palermo who took part in a "black-out" challenge that spread across the platform. While this story understandably captured public attention, it is worth noting that the Garante had already initiated formal proceedings against TikTok at the end of last year.

Earlier this week Reuters reported that the search engine giant Google may be facing yet another antitrust probe from the European Commission, this time in relation to its advertising practices. Under examination are, among others, the integration of DoubleClick (Google's ad serving unit) and the company's plan to phase out third-party cookies in Chrome. Several weeks ago the British Competition and Markets Authority opened an investigation into the same subject. According to Google, the greater concentration of its advertising ecosystem (aka the "Privacy Sandbox") is supposed to better protect consumers' privacy (which, of course, remains to be verified, particularly in light of the "dark patterns" employed by the trader), yet authorities fear it may negatively impact other interests (here: the commercial interests of publishers, although one could also think of consumer interests other than privacy). As the CMA notes, the challenge faced by regulators is "to address legitimate privacy concerns without distorting competition". It is worth recalling that online advertising is also covered by the proposed Digital Services Act, which envisages, for example, an obligation for very large online platforms to publish ad repositories with information about targeting.

Last but not least, back in November, the data protection organisation NOYB filed a complaint with the Spanish and German data protection authorities against Apple's Identifier for Advertisers, arguing that it allowed for user tracking without consent. Interestingly, the organisation - co-founded by the activist Max Schrems - chose to rely on the E-Privacy Directive instead of the GDPR, to avoid the "endless procedures" of cooperation between DPAs. Indeed, some of the high-profile cases under the GDPR are still ongoing, including an inquiry into Google's processing of location data, triggered by - you guessed it - a report by the Norwegian Consumer Council.

Tuesday, 26 January 2021

Norwegian Consumer Council - sheriff of online consumer protection

The Norwegian Consumer Council (Forbrukerrådet) has published two interesting news reports this month. 
 
First, on 14 January it reported on potentially unfair commercial practices of Amazon which make it difficult for consumers to cancel their Amazon Prime subscription (You can log out, but you can never leave). The Norwegian Consumer Council found that many of these practices amount to dark patterns used to manipulate consumers online, hindering them from making informed choices and nudging them away from actually cancelling the subscription (through misdirection, visual interference, confirmshaming). This may be achieved by making consumers go through many pages, asking them to confirm their choices in a confusing manner, etc. More generally, the Norwegian survey looked into the practices of digital service providers offering subscription-based services. Such subscriptions involve automatic payments and content delivered online; if consumers stop using a service, they may forget about it, as it becomes invisible to them. It may therefore be especially important to facilitate consumers' termination of such services. And yet the survey found that 25% of respondents had experienced problems with cancelling such subscriptions because a difficult cancellation process had been set up.

Today it reported that another Norwegian authority, the Norwegian Data Protection Authority (Datatilsynet), issued a fine of over €9.5 million to the dating app Grindr (10% of its global annual revenue), following the Norwegian Consumer Council's complaint from a year ago about infringements of privacy by the app (Historic victory for privacy as dating app receives gigantic fine). The breach of the GDPR occurred because the app collected and shared personal data without the users' sufficiently informed and explicit permission for such practices (more in the report 'Out of control', on Grindr specifically from p. 72).

Monday, 30 November 2020

GDPR complaints v Google: Will the (long) wait be worth it?

Last week the European Consumer Organisation (BEUC) drew attention to the fact that when national consumer organisations file complaints about infringements of GDPR rules, the procedure is loooooong (Commercial surveillance by Google. Long delay in GDPR complaints). 
 
We reported back in 2018 that several national consumer organisations had filed complaints about Google's deceptive design (the use of dark patterns) to acquire users' consent to constant tracking of their 'location history' (Google tracks every step you take). The complaints were lodged with various national data protection authorities in November 2018. It took until July 2019 for the decision to be made that the Irish Data Protection Commission would lead the investigation into the complaints. After another six months, in February 2020, we found out that the Irish Data Protection Commission had also opened an investigation of its own motion (own-volition inquiry). This means that two separate, but interlinked, procedures (as they pertain to similar reported infringements) are ongoing at the moment. 
 
Why? That is a very good question. What is the benefit of opening a new procedure before the same authority? Supposedly, the authority's own inquiry will provide insights necessary to resolve the submitted complaints. The date of any decision or report is unknown at the moment, which raises the question whether submitting GDPR complaints actually makes sense from the perspective of consumer protection. After all, while the procedure is ongoing and the complaints are pending, the reported practices have not been suspended.

Friday, 13 November 2020

Dark patterns and conditions for a valid consent to data processing - judgment of the CJEU in C‑61/19 Orange Romania

Earlier this week, the Court of Justice delivered a judgment in case C-61/19 Orange Romania, concerned with the conditions for a valid consent to the processing of personal data under EU data protection law (the Data Protection Directive 95/46/EC and the General Data Protection Regulation 2016/679, which applies as of May 2018). The case follows up on the previous ruling in C-673/17 Planet49, on which we commented last year (see also: Planet49: Pre-Ticked Checkboxes Are Not Sufficient...). Aside from confirming the importance of "active" consent, the Court elaborates on the requirements for consent to be informed, specific, unambiguous and freely given, building bridges to important categories known from consumer law, such as transparency and misleading practices.

Facts of the case

The dispute goes back to a fine imposed by the Romanian data protection authority on the provider of mobile telecommunications services, Orange România, for an allegedly unlawful storage of the copies of customers' identity documents. In particular, the authority argued, the data controller failed to demonstrate that the data subjects had given their valid consent to the contested processing. What makes the case interesting is that the storage of ID cards was, in fact, explicitly mentioned in the contracts which Orange concluded with its customers. Specifically, the following wording is cited:

"The customer states that: ... (ii) Orange România has provided the customer with all the necessary information to enable him or her to give his or her unvitiated, express, free and specific consent to the conclusion and express acceptance of the contract; (iii) he or she has been informed of, and has consented to [numerous types of processing, including the storage of copies of documents containing personal data for identification purposes]."

As seen from the above, both the declaration of "consent" and the confirmation of having received the associated information were pre-formulated by the trader. At least in certain cases, they were also already "pre-ticked". In fact, however, consent to the storage of copies of ID cards was not necessary for entering into a contract, and customers who refused to consent were not prevented from concluding one. Data subjects who did not wish their ID cards to be copied, though, were asked to go through additional steps, most notably to confirm their refusal on a specific form, which, like pre-ticked checkboxes, can be regarded as an example of dark patterns in action (or, in this case, "sludge"). 

Against this background, doubts were raised, among others, as to whether the clauses on data processing were sufficiently distinct from the remaining parts of the documents, whether the data subjects were misled about the possibility of refusing consent to the storage of ID cards and, if so, whether this could have an impact on the validity of their consent.

Legal provisions

Even though the contested fine was imposed on Orange România prior to the date of application of the GDPR, the Court of Justice decided to provide guidance on both Directive 95/46/EC and Regulation 2016/679. The key norms subject to analysis were those laying down the conditions for a valid consent. Focusing on the GDPR, attention should be drawn to Article 6(1)(a), which lists the data subject's consent among the grounds for the lawful processing of his or her personal data, and to Article 4(11), which defines "consent" as any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her. Of further relevance are the associated information duties in Article 13 as well as the (non-binding) clarifications in recitals 32 and 42.

Judgment of the Court

While the specific assessment of the case at hand has been left to the national court (in line with the nature of preliminary reference procedure), the judgment provides important guidance on the legal provisions to be applied. In particular:

  • The Court recalls that for consent to be validly expressed (by the data subject) and later demonstrated (by the controller), the corresponding wish of the data subject should be reflected in his or her active behaviour. In particular, unambiguous and informed consent cannot be inferred from the fact that the data subject did not deselect a pre-ticked checkbox (paras. 35-37, 45-46; on the burden of proof, see also paras. 42, 51).
  • The judgment goes on to discuss the definition of consent as a "specific" indication of the data subject's wishes, highlighting the requirements of Article 7(2) (presentation of the request for consent in a manner which is clearly distinguishable from other matters) and recital 42 of the GDPR (presentation of pre-formulated declarations in an intelligible and easily accessible form, using clear and plain language). The latter is especially worth highlighting, as it directly refers to Directive 93/13/EEC on unfair terms in consumer contracts. Transparency of declarations is also considered relevant for establishing whether consent so expressed has been informed. What is more, corresponding information provided by the controller "must enable the data subject to be able to determine easily the consequences of any consent he or she might give", which again brings to mind the requirements of substantive transparency known from consumer law stricto sensu (paras. 38-40, 47-48). The latter may have significant implications for the validity of consent to the processing of personal data in the context of automated decision-making.
  • Finally, an important part of the judgment concerns the requirement for consent to be freely given (and, again, informed). In para. 41, the Court observes that "in order to ensure that the data subject enjoys genuine freedom of choice, the contractual terms must not mislead him or her as to the possibility of concluding the contract even if he or she refuses to consent to the processing of his or her data" (similarly para. 49). This brings to mind the notions of misleading actions and omissions, known from Articles 6 and 7 of Directive 2005/29/EC on unfair commercial practices (note that the Directive refers directly to "freedom of choice" only in the subsequent provision on aggressive practices). At a later point in the judgment, the Court also questions the free nature of consent in the case at hand in view of the additional burden (sludge) imposed by the controller on data subjects wishing to refuse consent (para. 50). As in the other instances, however, the assessment is ultimately left to the referring court. 
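The Court's criteria translate quite naturally into implementation requirements for anyone building a consent form: consent must stem from an active act (no pre-ticked defaults), must be informed, and the controller bears the burden of demonstrating it. A minimal, hypothetical sketch of a server-side check along these lines (the record fields and function are our own illustration, not a prescribed compliance mechanism):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ConsentRecord:
    purpose: str                    # specific purpose, e.g. "store copy of ID card"
    checkbox_preticked: bool        # was the box ticked by default when rendered?
    user_action_at: Optional[str]   # ISO timestamp of the user's click, if any
    info_text_shown: bool           # was the purpose-specific notice displayed?


def consent_is_valid(record: ConsentRecord) -> bool:
    """Apply the Orange Romania criteria: consent must stem from an active,
    informed act of the data subject, and the controller must be able to
    demonstrate it (hence the recorded timestamp of the user's action)."""
    if record.checkbox_preticked:
        return False   # paras. 35-37: no consent can be inferred from pre-ticked boxes
    if record.user_action_at is None:
        return False   # no affirmative action by the data subject was recorded
    if not record.info_text_shown:
        return False   # consent given without the notice would not be "informed"
    return True
```

Note that the sketch deliberately stores evidence of the user's action: under Article 7(1) GDPR it is the controller, not the data subject, who must be able to demonstrate that valid consent was given.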

Concluding thoughts

Overall, the judgment provides a range of important reference points which may help to increase the level of consumer and data protection in the EU. Worth noting are the recurring references to the requirement of "informed" consent, which appears to complement and reinforce all other conditions. The judgment underlines the close connection between data protection and consumer law stricto sensu, which has long been observed in the literature. The recognition of the role of (substantive) transparency and of potentially misleading practices in assessing consent validity is also to be welcomed. Both seem especially relevant in the digital market, where the consequences of consent are often difficult to determine and where dark patterns remain prevalent.