
Saturday, 31 December 2022

December wrap-up of data protection cases (Google, Österreichische Datenschutzbehörde and Pankki S)

The end of the month (and the end of the year as well) is a good moment for summaries. This time we are taking a closer look at events in the area of data protection law. December was a month with a couple of interesting events, so here is a brief recap. 

Dereferencing allegedly inaccurate content (C-460/20 Google)

The case concerned two executives of a group of investment companies (a board member and a proxy) who asked Google to remove search results linking their names to certain articles criticising the group's investment model. They exercised the so-called right to be forgotten, guaranteed under Article 17(1) of the GDPR, claiming that the information presented contained false claims and defamatory opinions. They also wanted Google to remove their thumbnail images from the search results. Google rejected these requests, arguing that it could not tell whether the information contained in the articles was true or not.

In cases involving the erasure of data from a search engine operator's search results, two rights usually collide: the public's right of access to information (especially about persons holding public positions) and the individual's right to protection of his or her personal data, including the right to erasure and the protection of his or her good name, image, etc. The same problems were considered in this case, as we noted when reporting on the AG's opinion issued in the proceedings. In its ruling of 8 December 2022 the Court held that the person requesting the deletion of data is obliged to show that the information is manifestly inaccurate. "However, in order to avoid imposing on that person an excessive burden which is liable to undermine the practical effect of the right to de-referencing, that person has to provide only evidence that, in the light of the circumstances of the particular case, can reasonably be required of him or her to try to find in order to establish that manifest inaccuracy" (para. 68). This means that such a person cannot be required to present a judicial decision obtained against the publisher of the website in question, not even one given in interim proceedings, since that would impose an unreasonable burden on him or her. At the same time, "the operator of the search engine concerned cannot be required to investigate the facts and, to that end, to organise an adversarial debate with the content provider seeking to obtain missing information concerning the accuracy of the referenced content" (para. 71). Therefore, if the person who made a request for de-referencing submits relevant and sufficient evidence showing the manifest inaccuracy of the information found in the referenced content, the operator of the search engine is required to accede to that request. If, however, the inaccurate character of the information is not obvious in the light of the evidence presented, the operator is not required to grant the request (paras 72-73).
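For readers who prefer to see the operative logic spelled out, here is a minimal, purely illustrative sketch of the decision rule described in paras 68-73. The class and function names are our own and are not prescribed by the Court or used by any search engine operator; the sketch simply restates the two conditions the Court identified.

```python
from dataclasses import dataclass


@dataclass
class DeReferencingRequest:
    """Hypothetical model of a de-referencing request under Article 17(1) GDPR."""
    evidence_relevant_and_sufficient: bool  # evidence the requester can reasonably be expected to find
    inaccuracy_manifest: bool               # the inaccuracy is obvious in the light of that evidence


def operator_must_de_reference(request: DeReferencingRequest) -> bool:
    """Toy restatement of paras 68-73 of C-460/20.

    The operator is not required to investigate the facts itself or to
    organise an adversarial exchange with the publisher; it only weighs
    the evidence the requester has actually supplied.
    """
    return request.evidence_relevant_and_sufficient and request.inaccuracy_manifest
```

Anything beyond checking the evidence actually submitted, in particular any independent fact-finding by the operator, is precisely what the Court declined to require.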

As regards the thumbnails the Court concluded that "a separate weighing-up of competing rights and interests is required depending on whether the case concerns, on the one hand, articles containing photographs which are published on an internet page and which, when placed into their original context, illustrate the information provided in those articles and the opinions expressed in them, or, on the other hand, photographs displayed in the list of results in the form of thumbnails by the operator of a search engine outside the context in which they were published on the original internet page" (para. 101). The Court also stated that the informative value of those images should be taken into account independently of the context of their publication on the website from which they originate, nevertheless taking into account all the content that directly accompanies the display of those images in the search results and that can explain the informative value of those images (para. 108).

The concept of a "copy of personal data" under Article 15(3) of the GDPR. AG Pitruzzella's opinion in the Österreichische Datenschutzbehörde case (C-487/21)

The dispute arose over the interpretation of Article 15(3) of the GDPR, which provides that a data subject, as part of the right of access to one's personal data, may obtain a copy of that data. The complainant requested an exact copy of the data processed by the controller, including full copies of documents containing his personal data. However, the controller provided only some of the requested information as an aggregate that reproduced the stored personal data of the data subject in a table broken down by name, date of birth, street, postal code, and place, and in a statement summarising corporate functions and powers of representation. As part of the proceedings, the national court decided to refer several questions concerning the interpretation of Article 15(3) of the GDPR to the Court. 

On 15 December 2022, the AG delivered an opinion stating that the concept of “copy” referred to in Article 15(3) of the GDPR must be understood as "a faithful reproduction in intelligible form of the personal data requested by the data subject, in material and permanent form, that enables the data subject effectively to exercise his or her right of access to his or her personal data in full knowledge of all his or her personal data that undergo processing – including any further data that might be generated as a result of the processing, if those also undergo processing – in order to be able to verify their accuracy and to enable him or her to satisfy himself or herself as to the fairness and lawfulness of the processing so as to be able, where appropriate, to exercise further rights conferred on him or her by the GDPR". The AG underlined that this provision does not, in principle, entitle the data subject to obtain a full copy of documents containing the personal data, but, at the same time, does not exclude the need to provide that person with extracts from documents, whole documents or extracts from databases if that is necessary to ensure that the personal data undergoing processing are fully intelligible.

Right to know the identity of the persons who had access to one's personal data. AG Campos Sánchez-Bordona's opinion in the Pankki S case (C-579/21)

The third case also concerned the right of access to personal data, but from a different perspective. The data subject wanted to know who exactly (among the employees of the financial institution) had access to his personal data at the time when he was a customer of that institution and also its employee. The controller refused to provide the names of the employees, arguing that Article 15 of the GDPR does not apply to the log data of the institution's data processing system and that the information requested does not relate to the personal data of the data subject, but to the personal data of the employees.

The AG endorsed the controller's view and stated that Article 15(1) of the GDPR "does not give the data subject the right to know, from among the information available to the controller (where applicable, through records or log data), the identity of the employee or employees who, under the authority and on the instructions of the controller, have consulted his or her personal data". In justifying his opinion, he pointed out that "the identity of individual employees who have handled the processing of customer data is particularly sensitive information from a security point of view, at least in certain economic sectors" (para. 76). Disclosure of employees' data could expose them to attempts by customers of the banking institution to exert pressure and influence. Nevertheless, the AG noted that if a data subject has reasonable doubts about the integrity or impartiality of an individual who has participated on behalf of the controller in the processing of his or her data, this could justify the customer's interest in knowing the identity of that employee in order to exercise the right to take action against him or her (para. 78; NB in the relevant case the data subject made his request, in particular, in order to clarify the reasons for his dismissal).




Friday, 22 April 2022

Right to be forgotten vs. right of access to information. AG opinion in Google case (C-460/20)

Freepik (iconicbestiary)

One of the characters in a well-known movie called "The Social Network" said that the internet's not written in pencil, but it's written in ink. We know that information, once posted online, does not die, but circulates for many years. However, the General Data Protection Regulation (GDPR) guarantees us the right to erasure of our personal data (also known as the right to be forgotten), and the right to object to processing. Both rights can be exercised in certain situations specified in the regulation, such as when the data is processed unlawfully or because of a particular situation of the data subject. I will not go into details, as the purpose of this post is not to comment on the GDPR provisions, but to give an overview of an opinion delivered recently in the case C-460/20 Google by Advocate General Giovanni Pitruzzella. Although this is not the first case concerning deletion of personal data available on the Internet (see, for example, judgments of the Court of Justice in cases: C-131/12 Google Spain and Google, C-136/17 GC and Others, C-18/18 Glawischnig-Piesczek), this issue still raises doubts and will probably be the subject of preliminary questions more than once.


The case concerns the processing of personal data of a man holding important positions in financial services companies and of his ex-partner, who was a proxy in one of those companies. One of the websites published three articles that questioned the investment model adopted by some of the companies. In addition, it posted photos of the man and his ex-partner in a luxury car, a helicopter and in front of a plane. The photos, as well as the content of the articles, suggested that they were leading a sumptuous life at the expense of third parties. Because Google's search engine displayed links to pages with the articles in its search results, as well as thumbnail images from the articles, the plaintiffs requested that both the links to the pages and the thumbnails be removed from the list of search engine results. They claimed that the articles contained a number of erroneous allegations and defamatory opinions based on untrue facts. In their opinion, they were victims of blackmail by the website.


The German Bundesgerichtshof (Federal Court of Justice) has raised doubts about the interpretation of Article 17(3)(a) of the GDPR, a provision that entitles a controller to refuse to delete personal data if the processing is necessary for the exercise of the right to freedom of expression and information. The questions referred for a preliminary ruling thus concern the balancing of two conflicting fundamental rights guaranteed by the Charter of Fundamental Rights of the European Union: the right to information and freedom of expression, and the right to respect for private life and protection of personal data.


The Advocate General recognized and emphasized in his opinion the important role of "gatekeepers" played by search engines. Their activity is essential in ensuring universal, even democratic, access to information. As he points out, "in the vast ocean of information created on the Internet, much information would remain virtually inaccessible without the intermediation of these search engines" (para. 2). At the same time, search engines exercise control over the circulation of information on the Internet, since the inclusion of a link to certain websites in a search list, on the one hand, facilitates access to information for any Internet user and contributes to the dissemination of that information, while on the other hand, it may constitute a serious intrusion into the private sphere of the individuals to whom the information relates. Nevertheless, the rights to respect for private life and to the protection of personal data are not absolute. According to AG Pitruzzella, given the context of the case, and in particular the fact that the data subject performs a public function (more or less important, political or economic), it must be assumed that the right to information overrides the right to protection of personal data. He notes that "the confidence both of other economic operators and of consumers is a prerequisite for the proper functioning of the market. This confidence requires public access to information about persons in professional roles that is likely to affect market dynamics and consumer interests, sometimes even more markedly than the acts of policy makers. Naturally, this information is essentially that which relates to their professional roles, but can also extend to aspects of their private sphere where they are connected or, in any event, likely to impact their professional activity and affect public confidence" (para. 28).

However, there are exceptions to this rule. The right to information will not prevail if the information presented is false, even if it concerns a person who plays an important role in society. Incorrect information violates not only the protection of personal data but also the dignity of the data subject, by distorting his or her identity (para. 31). In such a situation, the right to data protection enjoys priority. The Advocate General drew this conclusion from the principle of data accuracy laid down in Article 5(1)(d) of the GDPR, according to which personal data must be accurate and, where necessary, kept up to date, while data that are inaccurate in light of the purposes of the processing must be erased or rectified without delay (para. 32). Data accuracy is one of the basic principles of the processing of personal data, and a breach of that principle renders the processing unlawful.


Freepik (rawpixel.com)
A special role in this respect is played by the operator of the Internet search engine, who acts as a data controller and is therefore responsible for the entire data processing. Its task is to assess whether a request to remove links to websites or images, i.e. a de facto request to delete personal data, should be accepted. The search engine operator, acting as a controller under the GDPR, must balance the fundamental rights mentioned above. How should this be done? In the AG's view, a kind of "procedural data due process" should be introduced. This imposes certain obligations on both the data subject and the controller which, although not explicitly stipulated in the regulation, can be derived from its content and are intended to serve the effective implementation of the right to be forgotten. First, if the data subject claims that information about him or her is false, he or she should provide prima facie evidence of its falsity unless "this is, in particular in view of the nature of the information concerned, manifestly impossible or unduly difficult" (para. 44). Secondly, the controller should carry out the verification of the disputed information "which is within the scope of his concrete capacities". It should thus analyse all data in its possession as the operator of the search engine, using the technological tools available. Moreover, the operator of the search engine should, where possible, "initiate rapidly an adversarial debate with the web publisher who initially disseminated the information, who will then be able to set out the reasons supporting the truth of the personal data processed and the lawfulness of the processing" (para. 45). The operator will then have to decide whether or not to grant the request for de-referencing. The request may be dismissed only "if substantial doubts remain as to whether the information in question is true or false, or if the weight of the false information in the context of the publication in question is manifestly insignificant and that information is not of a sensitive nature" (para. 46). The search engine operator is thus supposed to act as a quasi-court or arbiter, actively seeking the truth. In conclusion, the AG believes that appropriate activity should be required both of the data subject (making it plausible that the information is false) and of the controller (comprehensively verifying the accuracy of the information).

As far as the removal of thumbnail images displayed in the results of an image search is concerned, AG Pitruzzella considers that the same principles should be applied. The controller must likewise balance the rights in question and, in this case, should take into account only the informative value of the images as such, regardless of the content they illustrate on the website from which they originate. Conversely, if "in connection with a request for de-referencing of the link to a web page, the display of photographs in the context of the content of that web page were contested, it would be the informative value that those photographs have in that context which should be taken into account for the purposes of that balancing exercise" (para. 56).


The AG's opinion is not surprising, as it is in line with the existing case law of the Court. The question is whether this position, assuming the Court follows it, will strengthen the position of data subjects vis-à-vis controllers, i.e. Internet search engines. It seems that the argument of universal access to information, especially information about public figures (and in the age of the Internet the boundary between a "public person" and a "private person" is extremely fluid and unclear), can always be used as a justification for refusing to remove links to websites from the list of search results. The right to protection of personal data interferes here not only with the right to information, but indirectly also conflicts with the economic interests of the search engine operator, whose business model profits from such content. The more information, links, clicks and views, the better. The same is true for website operators who publish content on their portals. Even the prospect of high fines for unjustified refusals of data deletion requests, and thus for violations of the GDPR, does not seem to deter the "big players". However, so as not to end on such a pessimistic note, let us hope that, as time goes by, this trend will reverse and the right to be forgotten will become an effective tool for removing incorrect information online that undermines someone's reputation. Like a metaphorical eraser that wipes off the ink with which one writes on the Internet*.



*I refer to the words of Advocate General Maciej Szpunar in his opinion in Case C-18/18 (para. 2).

Tuesday, 24 September 2019

No one-size-fits-all approach to search engine de-referencing - CJEU in Google

Earlier this year we reported on the two opinions of Advocate General Szpunar concerning several aspects of the right to be forgotten: 1) the role of search engine operators in relation to sensitive data; 2) the nature of the respective obligation to respond to de-referencing requests; and 3) territorial reach of required de-referencing measures.

Today the Court of Justice delivered judgments in both cases. Importantly, despite the fact that the questions were referred from the point of view of Directive 95/46, the Court also took General Data Protection Regulation 2016/679 into account (by which the Directive was replaced in the meantime), in order to ensure "that its answers will in any event be of use to the referring court".

Source: Pixabay
The direction of both judgments generally remains in line with the interpretation proposed in the two opinions. In case C-136/17, the Court confirmed that restrictions on the processing of certain categories of sensitive data also apply to operators of search engines. As in the AG's opinion, that prohibition was nonetheless read in the context of the responsibilities, powers and capabilities of search engine operators. Restrictions on the processing of sensitive data thus concern the stage of ex post verification triggered by a request from the data subject. The judgment further lays down which steps a search engine operator must take when assessing such a request (and these are far from trivial).

The judgment in case C-507/17 concerned the territorial scope of the de-referencing measures which a search engine operator must take. The Court referred to the objective of ensuring a high level of protection of personal data in the EU, pursued by both Directive 95/46 and Regulation 2016/679. It further admitted that a de-referencing carried out on all the versions of a search engine would meet that objective in full and argued that the EU legislature enjoys the competence to lay down such an obligation (para. 58). That being said, the Court considered that the EU lawmakers had not done so thus far. In consequence, for the time being, EU data protection law does not require search engine operators to carry out a de-referencing on all worldwide versions of a search engine. Importantly, however, the Court did not exclude the possibility for a supervisory or judicial authority of a Member State to weigh up, in the light of national standards of protection of fundamental rights, a data subject's right to privacy and the protection of personal data concerning him or her, on the one hand, and the right to freedom of information, on the other, and, where appropriate, to order such de-referencing (para. 72).

As regards the EU, the Court began by observing that, in principle, de-referencing is to be carried out in respect of all Member States (para. 66) and that, if necessary, the search engine operator should be obliged to take sufficiently effective measures to ensure the effective protection of the data subject's fundamental rights. Measures of this kind should have the effect of preventing or, at the very least, seriously discouraging internet users in the Member States from gaining access to the links in question when searching on the basis of that data subject's name (para. 70). The Court left open the question whether automatic redirecting to a different national version of the search engine's website constitutes such a measure. It would seem that such blocking or redirection would then fall under the exception to customers' right of access to online interfaces set out in Article 3(3) of Regulation 2018/302 on geo-blocking.

At the same time, however, the Court accepted that the interest of the public in accessing information may, even within the Union, vary from one Member State to another, meaning that results of the balancing exercise are not necessarily the same for all the Member States. The Court thus emphasized the role of cooperation between supervisory authorities in the Member States as an adequate framework for reconciling the conflicting rights and freedoms. It is through this framework, therefore, that a de-referencing decision, covering all searches conducted from the territory of the Union on the basis of a data subject’s name, should be adopted (para. 69).

Sunday, 13 January 2019

Two opinions of AG Szpunar on the right to be forgotten

Last week also brought new developments regarding the interpretation of the right to be forgotten - a widely discussed right of data subjects developed by the Court of Justice in its earlier jurisprudence (see our 2014 post Google as data controller...). More specifically, Advocate-General Szpunar delivered his opinions in the two pending cases: C-136/17 G.C. and Others v CNIL and C-507/17 Google v CNIL. Just like Google Spain, both cases relate to Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data (and not yet the General Data Protection Regulation). Both are also concerned with the scope of search engine operators' obligation to respond to de-referencing requests by data subjects. 

Background of the cases

Both references of the French Conseil d’État pertained to disputed decisions of the national data protection authority (Commission for Information Technology and Civil Liberties, CNIL). The setting of each case was nevertheless quite different. In C-136/17 the CNIL refused to take measures against Google for failing to de-reference various links from search results and the affected data subjects complained about inaction. In C-507/17, by contrast, the search engine provider contested the sanctions imposed by the authority.

AG's opinions

The opinions presented last Thursday by the Advocate-General Szpunar shed light on several important aspects of the right to be forgotten: 1) the role of search engine operators in relation to sensitive data, 2) the nature of the respective obligation to respond to de-referencing requests, and 3) territorial reach of required de-referencing measures.

Processing sensitive data by search engine operators

As readers may recall, one of the controversial elements of the 2014 Google Spain judgment was the qualification of search engine operators as data controllers. This implied that the processing of personal data in the course of relevant activities needed to be authorized under one of the legal bases set out in the Directive. While the broader implications of this finding may not have been immediately apparent in the case of non-sensitive data, the picture became more complex as soon as special categories of data (e.g. about religious or philosophical beliefs) came into play. One of the questions asked in G.C. and Others was thus whether the prohibition of processing data falling within certain specific categories also applied to search engine operators.

The Advocate-General sought a balanced solution. He essentially replied in the affirmative, but observed that the specific responsibilities, powers and capabilities of search engine operators should be taken into account as part of the interpretation. In particular, it was recognized that the processing carried out by such entities is secondary in nature (an argument Google already tried to advance in the 2014 case). Hence, according to the AG, the prohibitions and restrictions set out in the Data Protection Directive can apply to a search engine operator only by reason of its referencing activities (searching, finding and making information available in an efficient way). Ex ante control of referenced web pages, which, according to the AG, is covered neither by the responsibilities nor by the capabilities of search engine providers, should therefore be excluded. Consequently, also with respect to sensitive categories of data, the primary focus remains on the ex post verification of de-referencing requests, which was the subject of the remaining questions.

Systematic de-referencing

With respect to the search engine operator's de-referencing duty (as a correlate of the data subject's right to be forgotten), the Advocate-General first considered whether search engine operators are obliged to systematically de-reference web pages on which sensitive data appear as soon as the absence of a legal ground for the processing is established. This matter appears to have divided the intervening parties and certainly needs to be looked at in more detail once all language versions of the opinion are available. For the time being, it suffices to report that, in the view of the AG, an operator of a search engine should generally be required to accede, as a matter of course (i.e. without regard to elements other than the lack of a legal ground), to requests for de-referencing relating to web pages on which sensitive data appear, subject to the limited exceptions provided for in Article 8. Notably, however, if the contested processing of personal data falls within the scope of Article 9 of Directive 95/46, i.e. when the processing is carried out solely for journalistic, artistic or literary purposes, a balancing exercise can be required, possibly resulting in the refusal of de-referencing requests.

Territorial scope

The second of the discussed cases, Google v CNIL, dealt with the territorial scope of de-referencing measures. By way of illustration: in the case of a request from a French data subject, should Google only deactivate links on Google.fr, on all EU domains, or on all worldwide domains? Or perhaps such de-referencing should (also) depend on the location from which the search is performed (assessed on the basis of the IP address)? In this respect, the AG decided to put limits on the CNIL's extraterritorial ambitions. In particular, he insisted that search requests made outside the EU should not be affected by the de-referencing of search results. A different (broader) interpretation could, in the view of the AG, create significant limitations in access to information and, as such, should be approached with caution. Considering the facts of the case, a worldwide de-referencing duty did not appear justified.

When it comes to the EU, however, the Advocate-General came out in favour of a rather broad territorial scope of de-referencing. Specifically, according to the opinion, once a right to be forgotten within the EU has been established, the search engine operator should take all measures available to it to ensure full and effective de-referencing within the EU, including by use of ‘geo-blocking’ in respect of an IP address located in the EU, irrespective of the domain name used by the internet user.
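For the technically curious, the sketch below illustrates what such IP-based geo-blocking of de-referenced results could look like in principle. The country list, the geolocate lookup and the function names are our own illustrative assumptions, not a description of how any search engine actually implements the measure.

```python
from typing import Callable, List, Set

# Illustrative list of EU Member State country codes; membership changes over
# time (at the time of the opinion the UK was still a Member State).
EU_MEMBER_STATES: Set[str] = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR", "DE", "GR",
    "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL", "PL", "PT", "RO", "SK",
    "SI", "ES", "SE",
}


def filter_search_results(results: List[str],
                          de_referenced_urls: Set[str],
                          client_ip: str,
                          geolocate: Callable[[str], str]) -> List[str]:
    """Drop de-referenced links for queries coming from an EU IP address,
    irrespective of the search-engine domain used (e.g. google.fr or google.com).

    `geolocate` stands in for whatever IP-to-country lookup an operator might
    use; it is a placeholder here, not a real API.
    """
    if geolocate(client_ip) in EU_MEMBER_STATES:
        return [url for url in results if url not in de_referenced_urls]
    return results
```

The point is simply that, on the AG's approach, the filtering decision turns on where the query comes from, not on which national version of the search engine the user has typed in.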

Concluding thought

The opinions of the Advocate-General come at a time of heated debate about the application of the European data protection framework following its recent reform. Both the right to be forgotten and the territorial scope of the new rules were exhaustively discussed in the legislative process leading to the adoption of the GDPR. As usual, the judgment of the Court of Justice is awaited with interest. This time, however, it will reveal not only whether the Court shares the view of its advisor, but also to what extent the interpretation eventually provided affects the framework applicable today.

Wednesday, 15 July 2015

Reading tip: Guardian article on Google "right to be forgotten" requests

Our readers will remember that since last year's ECJ decision in Google Spain, it has been possible for individuals who think certain information concerning them should not be featured among Google's search results to demand that the search engine put a filter in place by filling in a simple form.

The Guardian has perused Google's transparency report from last year and found some interesting information about the way this possibility has been used so far. The ensuing article is well worth a read.

Tuesday, 13 May 2014

Google as data controller and right to be forgotten - CJEU in Google Spain (C-131/12)

13 May 2014: CJEU judgment in Google Spain (C-131/12)

We have previously discussed the opinion of AG Jääskinen in the Google Spain case, where he argued that an online service provider like Google (providing search engine services) should not be considered to fall under the definition of a data controller, meaning that it would not be obliged to comply with data protection requirements and control what data is revealed through its search results (see No forgetting...). Interestingly, in today's judgment the CJEU takes a different stand on these issues.

"According to Google Spain and Google Inc., the activity of search engines cannot be regarded as processing of the data which appear on third parties’ web pages displayed in the list of search results, given that search engines process all the information available on the internet without effecting a selection between personal data and other information. Furthermore, even if that activity must be classified as ‘data processing’, the operator of a search engine cannot be regarded as a ‘controller’ in respect of that processing since it has no knowledge of those data and does not exercise control over the data." (Par 22)

While Google Spain claimed that it should not be seen as a data controller or be subject to data protection rules, the CJEU disagreed, noting that it had already previously considered the loading of personal data onto a website to fall under the definition of data processing in the Data Protection Directive. (Par. 26)

"Therefore, it must be found that, in exploring the internet automatically, constantly and systematically in search of the information which is published there, the operator of a search engine ‘collects’ such data which it subsequently ‘retrieves’, ‘records’ and ‘organises’ within the framework of its indexing programmes, ‘stores’ on its servers and, as the case may be, ‘discloses’ and ‘makes available’ to its users in the form of lists of search results. As those operations are referred to expressly and unconditionally in Article 2(b) of Directive 95/46, they must be classified as ‘processing’ within the meaning of that provision, regardless of the fact that the operator of the search engine also carries out the same operations in respect of other types of information and does not distinguish between the latter and the personal data." (Par. 28)

This finding is not called into question by the fact that the personal data have already been published elsewhere online and are not altered by the search engine. (Par. 29, 31) Moreover, since the definition of data controller in the Directive should be understood broadly, the fact that Google may not have control over what is posted on other websites does not exclude it from that definition. (Par. 33-37)

The CJEU also recalls the need to protect both private life and personal data, since these are among the fundamental rights enshrined in the Charter. (Par. 69) While the data subject may request the revision or removal of his personal data from the search engine's results, such a request for data protection can clash with the freedom of expression and other people's right to information.

"In the light of the potential seriousness of that interference, it is clear that it cannot be justified by merely the economic interest which the operator of such an engine has in that processing. However, inasmuch as the removal of links from the list of results could, depending on the information at issue, have effects upon the legitimate interest of internet users potentially interested in having access to that information, in situations such as that at issue in the main proceedings a fair balance should be sought in particular between that interest and the data subject’s fundamental rights under Articles 7 and 8 of the Charter. Whilst it is true that the data subject’s rights protected by those articles also override, as a general rule, that interest of internet users, that balance may however depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life." (Par. 81)

Keeping this in mind, the CJEU then decides that "the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful". (Par. 88)

Finally, the CJEU recognizes the "right to be forgotten". A data controller like Google may need to remove data that was originally published lawfully but has with time become "no longer necessary in the light of the purposes for which they were collected or processed. That is so in particular where they appear to be inadequate, irrelevant or no longer relevant, or excessive in relation to those purposes and in the light of the time that has elapsed." (Par. 93)

For the applicability of the Directive to Google's services, please read the judgment further (the replies to question 1), where the CJEU discusses, among other things, the definition of 'establishment'. (Par. 45-61)

This is an interesting decision that might lead to some practical difficulties in its application. Imagine that consumers now start asking Google en masse to remove links leading to websites containing information about them. How long would Google have to react to such requests? When would it be able to refuse to remove a link from its search results, and who would decide whether the content on a given website was lawfully published or has ceased to be relevant?

Monday, 27 January 2014

Data protection reform full speed ahead

On the eve of this year's Data Protection Day, the European Commission takes stock of the progress of the reform of the EU's legal framework for data protection that was set in motion two years ago (see our post 'EU data protection reform announced'). In a comprehensive press release, the Commission sets out the time frame for adoption of the proposed Data Protection Regulation and Directive (a possible agreement before the end of 2014); benefits for citizens, businesses and SMEs; the reform's foreseen impact on the Internal Market and on scientific research; and the meaning of the right to be forgotten. Furthermore, the Commission pays attention to the EU's response to allegations of surveillance of EU citizens by US intelligence agencies.

Vice-president Viviane Reding, who will give a speech at the Centre for European Policy Studies tomorrow, adds:

'Data protection in the European Union is a fundamental right. Europe already has the highest level of data protection in the world. With the EU data protection reform which was proposed exactly two years ago – in January 2012 – Europe has the chance to make these rules a global gold standard. These rules will benefit citizens who want to be able to trust online services, and the small and medium sized businesses looking at a single market of more than 500 million consumers as an untapped opportunity. The European Parliament has led the way by voting overwhelmingly in favour of these rules. I wish to see full speed on data protection in 2014.'

See also DG Justice's Data Protection website.

Tuesday, 2 July 2013

No forgetting - Opinion of AG Jääskinen in case C-131/12 Google Spain

While the internet makes possible the fast distribution of information (which, for instance, allows me to trace my co-authors' steps in Australia through their posts on this blog), it also makes people vulnerable to infringements of their privacy in case too much information is or remains available online. Since the word 'google' became a verb, cases in which a request is made to have certain data removed from websites have arisen with increasing frequency. A recent example is that of Google Spain v the Spanish Agency for Data Protection (AEPD) and Mario Costeja González, in which a number of questions have been referred to the Court of Justice of the EU for a preliminary ruling. Last week, Advocate General Jääskinen handed down his Opinion in this case.

The case concerned references to a Spanish newspaper article about a person having been involved in a real-estate auction because of his social security debts. Arguing that these proceedings had ended years earlier and were no longer of relevance, this person requested Google to make sure that no references to the newspaper article would appear when his name and surnames were entered in the Google search engine. The Director of the Spanish Data Protection Agency upheld his claim against Google Spain and Google Inc., requiring the search engine service providers to take the necessary measures to withdraw the data from their index and prevent further access to these data. Both Google Spain and Google Inc. appealed against this decision. The Spanish National High Court hearing their case referred a number of questions to the CJEU concerning the application of EU data protection law.

AG Jääskinen considers that search engine service providers are not responsible, on the basis of the Data Protection Directive, for personal data appearing on web pages they process. An important reason for this is that the Directive should not be considered to establish a general 'right to be forgotten'. In Jääskinen's opinion, such a right cannot be invoked against search engine providers on the basis of the Data Protection Directive, even when the Directive is interpreted in accordance with the Charter of Fundamental Rights of the EU. In other words, the AG seeks to discourage the CJEU from establishing a 'horizontal effect' of the right to be forgotten, that is: its application to the relationship between the search engine service provider and individuals whose data show up in searches. His opinion seems to be based, in particular, on the fact that the publisher's right to freedom of expression deserves protection as well, and that the task of striking a balance between privacy protection and freedom of expression should not be assigned to search engine service providers on a case-by-case basis. Rather, Member States should provide effective remedies against the infringement of privacy by web publishers.

See the press release for a brief summary of the AG's opinion.

Tuesday, 8 January 2013

Strengthening data protection in the EU

Today, two draft reports (by Albrecht and Droutsas) were presented in the European Parliament on the reform of the EU's data protection rules (as proposed a year ago by the European Commission). Both reports are positive and support "a coherent and robust data protection framework with strong and enforceable rights for individuals" (Commission welcomes European Parliament rapporteurs' support for strong EU data protection rules). This support extends both to the objectives set by the European Commission in establishing a new framework on data protection and to the suggested package approach (substantive and procedural rules tackled at the same time).

There are, of course, also changes that the rapporteurs suggest, such as reinforcement of the right to be forgotten and other individuals' rights (strengthening of explicit consent by making the language easier in privacy policies, anonymous use of data etc.). They also mention the need to replace the Directive with a Regulation as well as the need to establish an independent EU data protection agency that could take legally binding decisions, which would facilitate enforcement of these provisions. 

The LIBE Committee of the EP will discuss these reports on 10 January.

Wednesday, 20 June 2012

Outdoing Huxley

Data protection is high on the agenda of the European Commission, as is attested by the pending proposals for reforming the existing EU data protection framework (see earlier posts on this blog 'EU data protection reform' and 'EU data protection reform announced').

Last Monday, Commissioner Reding gave a speech on the topic at the Digital Enlightenment Forum in Luxembourg: 'Outdoing Huxley: Forging a high level of data protection for Europe in the brave new digital world'. She set out the Commission's aims in the field and explained the background to the proposals:

'Control of every movement, every word or every e-mail made for private purposes is not compatible with Europe's fundamental values or our common understanding of a free society. This is why the Union's Charter of fundamental rights recognises both the right to private life in Article 7 and the right to the protection of personal data in Article 8. But this is not all: Article 16 of the Treaty on the Functioning of the European Union also gives the European Union the legislative competence to establish harmonised EU data protection laws that apply to the whole continent and that make the right to data protection a reality. Data protection is thus one of the rare fields where we have full coherence between the fundamental right and the legislative competences of the EU. This makes data protection a particularly powerful fundamental right in the European Union, and the Commission's proposals from 25 January have been designed to put this right into practice everywhere in our internal market.'

Reding's speech replied to some points of criticism that the proposals encountered so far (e.g. from the European Data Protection Supervisor, see 'Critical look at the new data protection rules'). Regarding possible problems related to the enforcement of the proposed rules, she observed:

'In the interest of legal certainty and of fair competition, we have introduced a one-stop-shop system.

For the consumer, this means that they will always turn to their national data protection authority when they have a problem with a company – no matter where the company is based. They will not have to labour through the process of contacting authorities in different EU countries, riddled as it is with problems of different languages or procedures. We make things easy for the consumer.

The same un-bureaucratic one-stop-shop exists for companies as well. They will only have to deal with one data protection authority: in the country in which they have their main establishment. This cuts costs while increasing legal certainty.'

On the point of possibly conflicting fundamental rights, such as the rights to privacy and data protection (incl. a 'right to be forgotten') v freedom of the press, Reding added:

'We are thus allowing Member States to create rules to reconcile the right to the protection of personal data with the rules governing freedom of expression.

This is certainly a difficult balancing act, and one that can only be achieved in the knowledge of the specific details of each individual case and the specific national circumstances. In short, the right to be forgotten is not an absolute right, it is a relative right. Like the general right to privacy, it is a right that needs to be reconciled with other rights which are also protected by the EU's Charter of Fundamental Rights.'

See the website of the Digital Enlightenment Forum for more information and the programme of the meeting.

Analogue enlightenment - Luminara di San Ranieri, Pisa, 16 June 2012