
Saturday, 31 December 2022

December wrap-up of data protection cases (Google, Österreichische Datenschutzbehörde and Pankki S)

The end of the month (and the end of the year as well) is a good moment for summaries. This time we are taking a closer look at events in the area of data protection law. December was a month with a couple of interesting events, so here is a brief recap. 

Dereferencing allegedly inaccurate content (C-460/20 Google)

The case concerned two executives of a group of investment companies (a board member and a proxy) who asked Google to remove search results linking their names to certain articles criticising the group's investment model. They exercised the so-called right to be forgotten, guaranteed under Article 17(1) of the GDPR, claiming that the information presented contained false claims and defamatory opinions. They also wanted Google to remove their thumbnail images from the search results. Google rejected these requests, arguing that it did not know whether the information contained in the articles was true.

In cases involving the erasure of data from a search engine operator's search results, two rights usually collide: the public's right of access to information (especially about persons holding public positions) and the individual's right to the protection of his or her personal data, including the right to erasure and the protection of his or her good name, image, etc. The same problems arose in this case, as we noted when reporting on the AG's opinion issued in the proceedings. In its ruling of 8 December 2022 the Court held that the person requesting the deletion of data is obliged to show that the information is manifestly inaccurate. "However, in order to avoid imposing on that person an excessive burden which is liable to undermine the practical effect of the right to de-referencing, that person has to provide only evidence that, in the light of the circumstances of the particular case, can reasonably be required of him or her to try to find in order to establish that manifest inaccuracy" (para. 68). This means that such a person cannot be required to produce a judicial decision made against the publisher of the website in question, even one given in interim proceedings, since that would impose an unreasonable burden on him or her. At the same time, "the operator of the search engine concerned cannot be required to investigate the facts and, to that end, to organise an adversarial debate with the content provider seeking to obtain missing information concerning the accuracy of the referenced content" (para. 71). Therefore, if the person who made a request for de-referencing submits relevant and sufficient evidence showing the manifest inaccuracy of the information found in the referenced content, the operator of the search engine is required to accede to that request. However, the operator should not grant a request if the inaccurate character of the information is not obvious in the light of the evidence presented (paras 72 and 73). 

As regards the thumbnails the Court concluded that "a separate weighing-up of competing rights and interests is required depending on whether the case concerns, on the one hand, articles containing photographs which are published on an internet page and which, when placed into their original context, illustrate the information provided in those articles and the opinions expressed in them, or, on the other hand, photographs displayed in the list of results in the form of thumbnails by the operator of a search engine outside the context in which they were published on the original internet page" (para. 101). The Court also stated that the informative value of those images should be taken into account independently of the context of their publication on the website from which they originate, nevertheless taking into account all the content that directly accompanies the display of those images in the search results and that can explain the informative value of those images (para. 108).

The concept of a "copy of personal data" under the Article 15(3) of the GDPR. AG Pitruzzella opinion on Österreichische Datenschutzbehörde case (C487/21)

The dispute arose over the interpretation of Article 15(3) of the GDPR, which provides that a data subject, as part of the right of access to one's personal data, may obtain a copy of that data. The complainant requested an exact copy of the data processed by the controller, including full copies of documents containing his personal data. However, the controller provided only some of the requested information as an aggregate that reproduced the stored personal data of the data subject in a table broken down by name, date of birth, street, postal code, and place, and in a statement summarising corporate functions and powers of representation. As part of the proceedings, the national court decided to refer several questions concerning the interpretation of Article 15(3) of the GDPR to the Court. 

On 15 December 2022, the AG delivered an opinion stating that the concept of “copy” referred to in Article 15(3) of the GDPR must be understood as "a faithful reproduction in intelligible form of the personal data requested by the data subject, in material and permanent form, that enables the data subject effectively to exercise his or her right of access to his or her personal data in full knowledge of all his or her personal data that undergo processing – including any further data that might be generated as a result of the processing, if those also undergo processing – in order to be able to verify their accuracy and to enable him or her to satisfy himself or herself as to the fairness and lawfulness of the processing so as to be able, where appropriate, to exercise further rights conferred on him or her by the GDPR". The AG underlined that this provision does not, in principle, entitle the data subject to obtain a full copy of documents containing the personal data, but, at the same time, does not exclude the need to provide that person with extracts from documents, whole documents or extracts from databases if that is necessary to ensure that the personal data undergoing processing are fully intelligible.

Right to know the identity of the persons who had access to one's personal data. AG Campos Sánchez-Bordona's opinion in the Pankki S case (C-579/21)

The third case also concerned the right of access to personal data, but from a different perspective. The data subject wanted to know who exactly (among the employees of the financial institution) had accessed his personal data at the time when he was both a customer and an employee of that institution. The controller refused to provide the names of the employees, arguing that Article 15 of the GDPR does not apply to the log data of the institution's data processing system and that the information requested does not relate to the personal data of the data subject, but to the personal data of the employees. 

The AG endorsed the controller's view and stated that Article 15(1) of the GDPR "does not give the data subject the right to know, from among the information available to the controller (where applicable, through records or log data), the identity of the employee or employees who, under the authority and on the instructions of the controller, have consulted his or her personal data". In justifying his opinion, he pointed out that "the identity of individual employees who have handled the processing of customer data is particularly sensitive information from a security point of view, at least in certain economic sectors" (para. 76). Disclosure of employees' data could expose them to attempts by customers of the banking institution to exert pressure and influence. Nevertheless, the AG noted that if a data subject has reasonable doubts about the integrity or impartiality of an individual who has participated on behalf of the controller in the processing of his or her data, this could justify that customer's interest in knowing the identity of the employee in order to take action against that employee (para. 78; n.b. in the case at hand the data subject made his request, in particular, in order to clarify the reasons for his dismissal).




Friday, 22 April 2022

Right to be forgotten vs. right of access to information. AG opinion in Google case (C-460/20)


One of the characters in a well-known movie called "The Social Network" said that the internet's not written in pencil, but it's written in ink. We know that information, once posted online, does not die, but circulates for many years. However, the General Data Protection Regulation (GDPR) guarantees us the right to erasure of our personal data (also known as the right to be forgotten), and the right to object to processing. Both rights can be exercised in certain situations specified in the regulation, such as when the data is processed unlawfully or because of a particular situation of the data subject. I will not go into details, as the purpose of this post is not to comment on the GDPR provisions, but to give an overview of an opinion delivered recently in the case C-460/20 Google by Advocate General Giovanni Pitruzzella. Although this is not the first case concerning deletion of personal data available on the Internet (see, for example, judgments of the Court of Justice in cases: C-131/12 Google Spain and Google, C-136/17 GC and Others, C-18/18 Glawischnig-Piesczek), this issue still raises doubts and will probably be the subject of preliminary questions more than once.


The case concerns the processing of personal data of a man holding important positions in financial services companies and of his ex-partner, who was a proxy in one of those companies. One of the websites published three articles that questioned the investment model adopted by some of the companies. In addition, it posted photos of the man and his ex-partner in a luxury car, a helicopter and in front of a plane. The photos, as well as the content of the articles, suggested that they were leading a sumptuous life at the expense of third parties. Because Google's search engine displayed links to the pages with the articles in its search results, as well as thumbnail images of the articles, the plaintiffs requested that both the links to the pages and the thumbnails be removed from the list of search results. They claimed that the articles contained a number of erroneous allegations and defamatory opinions based on untrue facts. In their view, they were victims of blackmail by the website.


The German Bundesgerichtshof (Federal Court of Justice) raised doubts about the interpretation of Article 17(3)(a) of the GDPR, a provision that entitles a controller to refuse to delete personal data if the processing is necessary for the exercise of the right to freedom of expression and information. The questions referred for a preliminary ruling thus concern the balancing of two conflicting fundamental rights guaranteed by the Charter of Fundamental Rights of the European Union: the right to information and freedom of expression, and the right to respect for private life and the protection of personal data.


The Advocate General recognized and emphasized in his opinion the important role of "gatekeepers" played by search engines. Their activity is essential in ensuring universal, even democratic, access to information. As he points out, "in the vast ocean of information created on the Internet, much information would remain virtually inaccessible without the intermediation of these search engines" (para. 2). At the same time, search engines exercise control over the circulation of information on the Internet, since the inclusion of a link to certain websites in a search list, on the one hand, facilitates access to information for any Internet user and contributes to the dissemination of that information, while on the other hand, it may constitute a serious intrusion into the private sphere of the individuals to whom the information relates. Nevertheless, the rights to respect for private life and to the protection of personal data are not absolute. According to AG Pitruzzella, given the context of the case, and in particular the fact that the data subject performs a public function (more or less important, political or economic), it must be assumed that the right to information overrides the right to protection of personal data. He notes that "the confidence both of other economic operators and of consumers is a prerequisite for the proper functioning of the market. This confidence requires public access to information about persons in professional roles that is likely to affect market dynamics and consumer interests, sometimes even more markedly than the acts of policy makers. Naturally, this information is essentially that which relates to their professional roles, but can also extend to aspects of their private sphere where they are connected or, in any event, likely to impact their professional activity and affect public confidence" (para. 28). 

However, there are exceptions to the rule. The right to information will not prevail if the information presented is false, even if it concerns a person who plays an important role in society. Incorrect information not only violates the protection of personal data but also the dignity of the data subject by distorting his or her identity (para. 31). In such a situation, the right to data protection will enjoy priority. This conclusion was drawn by the Advocate General from the principle of data accuracy formulated in Article 5(1)(d) of the GDPR, according to which personal data must be accurate and, where necessary, kept up to date, while data that are inaccurate in light of the purposes of the processing must be erased or rectified without delay (para. 32). Data accuracy is one of the basic principles of the processing of personal data, and its violation renders the processing unlawful.


A special role in this respect is played by the operator of the Internet search engine, who acts as a data controller and is therefore responsible for the entire data processing. Its task is to assess whether a request to remove links to websites or images, i.e. a de facto request to delete personal data, should be accepted. The search engine operator, acting as a controller under the GDPR, must balance the aforementioned fundamental rights. How should this be done? In the AG's view, some kind of "procedural data due process" should be introduced. This is to impose certain obligations on both the data subject and the controller, which, although not explicitly stipulated in the regulation, can be interpreted from its content and are intended to serve the effective implementation of the right to be forgotten. First, if the data subject claims that information about him or her is false, he or she should provide prima facie evidence of its falsity unless "this is, in particular in view of the nature of the information concerned, manifestly impossible or unduly difficult" (para. 44). Secondly, the controller should carry out the verification of the disputed information "which is within the scope of his concrete capacities". Thus, it should analyze all data in its possession as the operator of the search engine, using the technological tools available. Moreover, the operator of the search engine, where possible, should "initiate rapidly an adversarial debate with the web publisher who initially disseminated the information, who will then be able to set out the reasons supporting the truth of the personal data processed and the lawfulness of the processing" (para. 45). Then, the operator will have to decide whether or not to grant the request for de-referencing. The request may be dismissed only "if substantial doubts remain as to whether the information in question is true or false, or if the weight of the false information in the context of the publication in question is manifestly insignificant and that information is not of a sensitive nature" (para. 46). The search engine operator is thus supposed to act as a quasi-court or an arbiter, actively seeking the truth. In conclusion, the AG believes that appropriate activity should be required on the part of the data subject (by making it plausible that the information is false) and on the part of the controller (by comprehensively verifying the accuracy of the information).

As far as the removal of thumbnail images displayed in the results of an image search is concerned, AG Pitruzzella considers that the same principles should be applied. The controller must also balance the rights, and in this case should take into account only the informational value of the images as such, regardless of the content they illustrate on the website from which they originate. Conversely, if "in connection with a request for de-referencing of the link to a web page, the display of photographs in the context of the content of that web page were contested, it would be the informative value that those photographs have in that context which should be taken into account for the purposes of that balancing exercise" (para. 56).


The AG's opinion is not surprising, as it is in line with the existing case law of the Court. The question is whether this position, assuming that the Court follows it, will contribute to strengthening the position of data subjects vis-à-vis the controllers, i.e. Internet search engines. It seems that the argument of universal access to information, especially information about public figures (and in the age of the Internet the boundary between "public person" and "private person" is extremely fluid and unclear), can always be used as a justification for refusing to remove links to websites from the list of search results. The right to protection of personal data interferes here not only with the right to information, but indirectly also conflicts with the economic interests of the search engine operator, who profits from such a business model. The more information, links, clicks and views, the better. The same is true for website operators who publish content on their portals. Even the threat of high fines for unjustifiably refusing an erasure request, and thus violating the GDPR, does not deter the "big players". However, in order not to end on such a pessimistic note, let's hope that as time goes by this trend will reverse and the right to be forgotten will become an effective tool for removing incorrect information online that undermines someone's reputation. Like a metaphorical eraser that wipes off the ink with which one writes on the Internet*. 



*I refer to the words of Advocate General Maciej Szpunar in his opinion in Case C-18/18 (para. 2).

Tuesday, 1 February 2022

Cookies, Google Analytics, transfers of PNR data and new guidelines on the right of access… Wrapping up January events in data protection


The New Year brought us some interesting developments in the data protection landscape. There are a few January facts worth noting:


Fines imposed on Google and Facebook for non-compliance with the cookie rules 
At the beginning of January*, the French supervisory authority, the Commission Nationale de l'Informatique et des Libertés (CNIL), imposed a 150 million euro fine on Google and a 60 million euro fine on Facebook Ireland Limited - both for violations related to the use of cookies. According to the authority, users of sites owned by the companies (namely google.fr, youtube.com and facebook.com) cannot reject cookies as easily as they can accept them. Accepting cookies is possible with a single click of a button on the page, while no equivalent option is available for refusing them. Denying consent to cookies requires more involvement on the part of the user and at least several clicks. As a result, such a complicated refusal mechanism may act as a disincentive for users, so that they are more likely to accept cookies against their will. This in turn violates Article 82 of the French law transposing the provisions of the e-Privacy Directive. It also fails to meet the requirements for valid consent under the GDPR.
As a reminder, this is not the first sanction imposed by the CNIL on Google. In December 2020, the CNIL also fined Google LLC and Google Ireland Limited 100 million euro, because a large number of cookies used for advertising purposes were automatically deposited on users' computers, without prior consent and without adequate information being provided. The Google companies filed an appeal against the decision, but in late January 2022 the French Council of State upheld the CNIL's decision.


Use of Google Analytics not compliant with the GDPR
January was not a successful month for Google in terms of data protection. In addition to the above penalties, the Austrian Data Protection Authority found that a tool used on many websites, Google Analytics, violates the protection of EU citizens' personal data.** Why? Because the tool transfers personal data to the United States, and in the US, Europeans' personal data is not adequately protected. Previously, personal data could be transferred from the EU to the US under the EU Commission's decision on the adequacy of the protection provided by the EU-US Privacy Shield, but since the CJEU declared that decision invalid in mid-July 2020, data controllers should base data transfers on a different legal ground (for example, on standard contractual clauses). The problem is that US law does not provide sufficient protection against access to personal data by various public authorities, regardless of the legal basis on which personal data is transferred. Even though such EU-US data transfers became illegal literally overnight, many companies continue to transfer personal data to the United States, mainly using IT tools provided by US companies, such as Google Analytics and similar technologies. The decision of the Austrian authority is therefore not surprising, but it certainly provides another confirmation that transfers of personal data to the US are legally questionable. Companies should examine their practices and consider choosing alternative European IT tool providers. And not only companies! It looks like the European Parliament should too - the European Data Protection Supervisor also issued a decision in January this year in which he questioned the legality of transfers of data collected via cookies on one of the EP's websites. 

EU rules on the collection of air passenger information are in line with the EU Charter of Fundamental Rights and the GDPR, but with some reservations

On 27 January, AG Pitruzzella delivered his opinion in case C-817/19 Ligue des droits humains concerning, inter alia, the interpretation of the provisions of Directive 2016/681 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime. AG Pitruzzella takes the view that the transfer of PNR data and the pre-travel screening of air passengers by means of automated processing of such data are generally compatible with Articles 7 and 8 of the EU Charter of Fundamental Rights. However, he also pointed out that such data should only be stored when necessary in view of a serious and genuine threat to security, and for a period limited to the minimum necessary. 

This case deserves a wider comment and a separate blog post, so we will come back to this topic shortly, as soon as the English version of the opinion is published on the Court's website. 



Guidelines on data subject rights - right of access

Finally, at the end of January, the European Data Protection Board published new guidelines on data subjects' rights, specifically on the right of access to data. For the time being, this is the version for public consultation. The feedback period is now open, so make your voice heard until March 11th!


* To be precise - CNIL's decisions were issued on December 31, 2021, but the information about the fines was published on the authority's official website in the first days of January. 
** Again, the decision was issued just before Christmas, but published on January 12, 2022. 



Monday, 30 November 2020

GDPR complaints v Google: Will the (long) wait be worth it?

The European Consumer Organisation (BEUC) drew attention last week to the fact that when national consumer organisations file complaints for infringements of the GDPR rules, the procedure is loooooong (Commercial surveillance by Google. Long delay in GDPR complaints). 
 
We reported back in 2018 that several national consumer organisations had filed complaints about Google's deceptive design (the use of dark patterns) to acquire their users' consent to constant tracking of their 'location history' (Google tracks every step you take). The complaints were lodged with various national data protection authorities in November 2018. It took until July 2019 for the decision to be made that the Irish Data Protection Commission would lead the investigation into the complaints. After another six months, in February 2020, we found out that the Irish Data Protection Commission had also opened an investigation of its own motion (own-volition inquiry). This means that two separate procedures - interlinked, as they pertain to similar reported infringements - are ongoing at the moment. 
 
Why? That is a very good question. What is the benefit of opening a new procedure with/by the same authority? Supposedly, the data protection authority's own inquiry will provide insights necessary to resolve the submitted complaints. The date of any decision or report is unknown at the moment. This raises the question whether submitting GDPR complaints actually makes sense from the perspective of consumer protection. After all, whilst the procedure is ongoing and the complaints are pending, the reported practices have not been suspended.

Thursday, 14 February 2019

French tribunal invalidates many terms in Google+ T&Cs

On Tuesday, the Tribunal de Grande Instance of Paris decided on a claim brought by the French consumer association Que Choisir against Google, challenging the company's practices and contract terms involved in the (recently discontinued) Google+ service. 

The association challenged Google's Terms of Service and Privacy policy in their entirety, but also a large number of individual clauses contained therein. 

The Court analysed these terms in light of consumer legislation, in particular the unfair terms provisions in the Code de la consommation, and data protection rules. It also, possibly quite crucially, relied on a number of provisions in the same Code which dictate the information that consumers must receive prior to contract conclusion. 

Different types of terms were, in this context, considered invalid:

1) Terms which described the purpose of data collection in a way that did not allow the consumer to really understand what their information was going to be used for
In particular, the Court condemned certain terms for presenting data collection as (exclusively) aimed at providing better services, rather than making the consumer aware of the commercial value and utilisation of the information collected (see clause 4 privacy policy, p. 88 of the decision).

2) Terms concerning geo-localisation
In this case, the main problem is that the geo-localisation has no direct connection to the service and takes place by connecting to information stored by different services. Consumers should, the decision implies, have the chance to accept or reject this separately. See Clause 9 Privacy Policy, p. 93.

3) Terms allowing the provider to change the data concerning certain users, and to keep a log of old data that a user has sought to rectify
This is against data protection principles, which put individuals in control of their personal data after it has been collected. See clauses 14 and 17 Privacy Policy, pp. 98-100. 

4) Terms requiring users to accept that their information may be stored outside of the EU/EEA, without safeguards 
Such terms are not so much unfair as they are plainly in contrast with mandatory rules restricting the transmission of data outside of the EEA, except when provided for by "safe harbour" agreements. See Clause 19 Privacy Policy, p. 102.

5) Terms allowing the provider to change its conditions, or to terminate the provision of the service, without indicating the grounds on which such measures could be taken

6) One of the terms in the Terms of Use was declared invalid for its attempt at waiving all sorts of liability without any clear delimitation of the waiver's reach

7) Another term, concerning cookies, alerted consumers that "not all services" could reasonably work without them, but did not give any indication as to what the specific impact of refusing cookie collection could be 

The Court considered both the Terms of Use and the Privacy Policy as parts of one global contractual agreement. Contrary to the association's submissions, it considered that the presentation of the two documents was in itself sufficient to provide users with information concerning the nature and scope of what consumers agree to: in particular, the use of hyperlinks and the "fragmentation" of relevant information are suitable to avoid an excessive concentration of information in a single text in limited space, the lexicon is sufficiently informal and includes a glossary, and the personal nature of the information processed is sufficiently highlighted. 

In particular, to the extent that the Court seems to require separate approval - i.e. approval that is not obligatory in order to get access to the service - for matters such as geo-localisation, Que Choisir has commented that the decision marks an end to "les conditions générales interminables à accepter en bloc" (the endless general conditions to be accepted as a whole). In some cases, where the contested terms were plainly against data protection legislation, the decision should also mean that those terms may no longer be employed. 

On the other hand, in so far as transparency was the reason for invalidating many of the controversial clauses, it remains to be seen what the practical consequences of the decision - which is, furthermore, still subject to appeal - will be. Interesting times!

(A PDF copy of the decision, in French, is available on Que Choisir's website as linked above)

Wednesday, 30 January 2019

More bad news for Google on the data protection front: Polish NGO files a complaint

Last week we reported on a decision of the French Data Protection Authority - the CNIL - imposing a €50 million fine on Google for alleged infringement of the European data protection rules. This, however, does not seem to be the end of Google's headaches. Earlier this week, a Polish NGO - Panoptykon - filed a complaint against the company with the President of the national Personal Data Protection Office.

Besides the complaints' addressee, the two cases do not seem to have much in common. As a matter of fact, Google is not the only entity against which Panoptykon complained. A separate complaint was lodged against IAB Europe, an industry association in the field of interactive marketing. Both complaints concern the functioning of the market for online behavioural advertising, in which, according to Panoptykon, IAB and Google are key players.

The developments in Poland are a direct follow-up to the two complaints lodged last September in Ireland and the UK by Brave ("a privacy-focused web browser" set up by Mozilla's co-founder Brendan Eich) and Open Rights Group. Their focus remains on the real-time bidding (RTB) system used in the advertising market, which the applicants believe to infringe the General Data Protection Regulation on at least several counts (for a rough explanation of the system see a video uploaded by... the IAB itself; further reference can be made to a report by Johnny Ryan of Brave). Key arguments of the complaining organisations concern the lack of a valid legal basis, including for the processing of sensitive data, failure to ensure data security, and the lack of appropriate control tools for data subjects (e.g. to verify and correct their marketing profiles).

Not surprisingly, Panoptykon's campaign has met with animated reactions. The President of IAB Poland drew a parallel to complaining against "a car producer for [producing cars] having technical abilities of breaking traffic rules, like exceeding speed limits or parking in restricted areas". He recalled the Transparency & Consent Framework created by the association with the aim to "help the businesses [involved in the ecosystem] to comply with applicable law". He also insisted that the RTB system is by no means "directed" by IAB Europe. Panoptykon, on the other hand, described the association as a "standard-setter" who, sticking to the traffic metaphor, "laid down the rules to be followed on its private roads in a way that makes it impossible to drive safely". 

All eyes are now on the Polish DPA, who is widely regarded as an expert in the field. Panoptykon encourages the authority to engage in a joint operation with its British and Irish counterparts based on Article 62 of the GDPR. Thus, similarly to the French proceedings, the commented case seems like an important test for the GDPR's procedural framework.


Monday, 21 January 2019

€50m fine imposed on Google by the French DPA

Just a few days ago we reported on two opinions of Advocate General Szpunar in cases involving the French data protection authority, the Commission Nationale de l'Informatique et des Libertés (CNIL), and the US digital giant Google. We mentioned that both cases concerned the interpretation of the Data Protection Directive, the predecessor of the currently applicable General Data Protection Regulation. Earlier today the CNIL issued yet another decision, once again directed against Google, this time blazing the trail for the application of the new data protection rules.

The decision, imposing a 50 million euro fine on Google LLC, is bound to raise both substantive and procedural questions. Unlike previous cases, which primarily focused on the right to be forgotten, the decision issued today concerns the alleged "lack of transparency, inadequate information and lack of valid consent regarding the ads personalization". Indeed, the GDPR has further specified the data controllers' transparency obligations, the requirements for valid consent and the data subjects' information rights, and has backed them with effective sanctions. The emerging case practice clearly illustrates the growing importance of data protection law for the protection of consumer interests in the digital age. The GDPR-based complaint filed by several European consumer organisations concerning Google's location tracking practices, on which we reported last November, further exemplifies this trend.

On the procedural side, the question may arise whether the French DPA was at all competent to deal with the case, considering the "one-stop-shop" mechanism introduced by the GDPR. The CNIL seems to argue that the mechanism was not applicable in the present case due to the lack of Google's main establishment in the EU. Considering the growing interest in the company's data processing practices across European jurisdictions, Google's appeal against this finding would be anything but surprising (update: an appeal has in the meantime been confirmed). Incidentally, in the 'location data' case, the respective complaint had been lodged with the Norwegian DPA, highlighting the relevance of the matter for the whole EEA.

Full text of today's CNIL decision (in French) can be consulted here.

Tuesday, 27 November 2018

Google tracks every step you take

The Norwegian Consumer Council and seven European consumer organisations filed a complaint today against Google, arguing that Google uses deceptive design and misleading information in order to acquire users' consent to constant tracking (New study: Google manipulates users into constant tracking). How would the users be tracked? Well, if you have an Android phone or use Google accounts on other devices, then it is likely that you are one of the victims of constant tracking, as Google accounts have 'location history' and 'web & app activity' integrated into their settings. You may have been prompted/manipulated to switch on location history without having realised it. Having read the above, you may think that Google 'simply' has access to your GPS data. Would you know, though, how detailed this data is (including which floor of a particular building you are on, or which room of the house you are in) and that it may be linked to other information, e.g. your online search results? The combined data may, of course, then be used for targeted advertising, increasing its effectiveness. Would you know how to switch it off and how to avoid having it switched on again? If this short post does not make you want to check your phone and its settings, maybe you should look up the full report on the study of Google's tracking practices that has been published: Every step you take.


Friday, 20 July 2018

Record fine for Google for breaching EU antitrust rules: is there anything for consumers?


Earlier this week, on the 18th of July, the European Commission fined Google €4.34 billion for breaching EU antitrust rules. This is the largest fine ever imposed for such violations to date.

It is now evident that since 2011 Google has imposed illegal restrictions on other Android device manufacturers and mobile network operators, abusing its dominant position in the markets for general internet search services, licensable smart mobile operating systems and app stores for the Android mobile operating system.

In particular, Google 1) required manufacturers to pre-install the Google Search app and browser app (Chrome) as a condition for licensing Google's app store (the Play Store), engaging in the illegal practice of so-called 'tying'; 2) made illegal payments to certain large manufacturers and mobile network operators on condition that they exclusively pre-installed the Google Search app on their devices; and 3) illegally prevented manufacturers wishing to pre-install Google apps from selling even a single smart mobile device running on alternative versions of Android that were not approved by Google. Google's conduct prevented a number of large manufacturers from developing and selling devices based on Amazon's Android fork called "Fire OS".

The antitrust decision requires Google to bring its illegal conduct to an end within 90 days of the decision. At a minimum, Google has to stop all three types of illegal practices described above. The decision also requires Google to refrain from any measure that has the same or an equivalent object or effect as these practices. The Commission will monitor compliance with the decision, and in the event of failure to comply, Google can face a penalty payment of up to 5% of its average daily worldwide turnover.

This decision is beneficial for consumers in two ways. First, by stopping the abuse of dominant position, the decision is likely to result in increased competition in the given markets that brings better products and lower prices for consumers. Second, harmed consumers are able to claim compensation in civil actions for damages in their national courts based on the new EU Antitrust Damages Directive.