
Saturday, 26 February 2022

The long-awaited Data Act proposal finally (officially) published

For several years, the European Union has been developing a new digital policy framework that aims to comprehensively regulate the data space in the EU. One of the EU's policy objectives is to make the data generated by humans and machines, especially in the context of IoT devices, more accessible, thereby unlocking the enormous but still under-used potential of this data. According to the European Strategy for Data released in 2020, this objective is to be achieved, inter alia, through the adoption of a so-called Data Act - a regulation on harmonised rules on fair access to and use of data. A leaked version of this act had been circulating on the Internet since the beginning of February, but it was not until 23 February 2022 that it was officially published by the European Commission.

Although the Data Act is mostly focused on business-to-business and business-to-government data sharing, it is also important for consumer protection in the digital environment. As we can read in the proposal’s explanatory memorandum:


a high level of consumer protection is reinforced with the new right to access user generated data in situations previously not covered by Union law. The right to use and dispose of lawfully acquired possessions is reinforced with a right to access data generated from the use of an Internet of Things object. This way, the owner may benefit from a better user experience and a wider range of, for example, repair and maintenance services. In the context of consumer protection, the rights of children as vulnerable consumers deserve specific attention and the rules of the Data Act will contribute to clarity about data access and use situations. [p. 13]


and


The proposal facilitates the portability of the user’s data to third parties and thereby allows for a competitive offer of aftermarket services, as well as broader data-based innovation and the development of products or services unrelated to those initially purchased or subscribed to by the user. [p. 13]


These assumptions are reflected mainly in Chapter II of the proposal, which introduces, among others:

  • the obligation to make data generated by the use of products or related services accessible (Article 3);
  • the right of users to access and use data generated by the use of products or related services (Article 4);
  • the right of users to share data with third parties (Article 5);
  • the obligations of third parties receiving data at the request of the user (Article 6).

The proposal will now proceed along the legislative path and be further debated before the European Parliament and the Council. It will certainly also be discussed among the scientific community and consumer organisations. The EC's proposals, although at first glance reasonable and necessary, require an in-depth analysis, in particular from the perspective of already existing data protection and consumer law. Let us just recall that under the GDPR, data subjects already have the right of access to their data (Article 15 GDPR) and the right to data portability (Article 20 GDPR). The effective exercise of these rights is sometimes problematic in practice, for example due to the controller's lack of actual control over data flows or the lack of interoperability between devices/services, which makes it impossible to transfer data from one provider to another. It is also important to remember that the devices we use every day as consumers may generate not only data containing personal information (and therefore qualifying as personal data), but also non-personal data of a technical nature, containing valuable information about how the devices function or are used by consumers. At the same time, given the large volumes of data produced by IoT devices and services, the distinction between personal and non-personal data is increasingly difficult to draw. For these reasons, the Data Act is a piece of EU legislation that has been long awaited and much anticipated. We can therefore expect the debate surrounding this act to be very lively and interesting.


Wednesday, 22 December 2021

Invalid consent and illegal sharing of sensitive data - € 6.5 million fine imposed by the Norwegian DPA on Grindr LLC

It would seem that the General Data Protection Regulation sets out quite strict requirements for consent as a legal basis for personal data processing. But even clear-cut conditions (indeed, not always easy to meet) will not force or encourage data controllers to adopt fully compliant practices, especially when commercial interests are at stake. This time under scrutiny was Grindr - the world’s largest dating app for the LGBTQ+ community. Last week the Norwegian Data Protection Authority imposed a fine of approximately € 6.5 million for several GDPR breaches.

The main problem concerned the consent mechanism employed in the application. Grindr implemented a model in which, upon registration, a user was only asked whether to "Cancel" or "Accept" the privacy policy. If the "Cancel" button was chosen, the data subject could not use the app. What is more, users were not asked separately whether they consented to the sharing of their personal data with Grindr’s partners for marketing purposes. They had to accept the policy in its entirety in order to use the app - a classic "take it or leave it" situation. Moreover, the length of the privacy policy and the variety of information contained in it made it even more difficult to become acquainted with all the relevant issues and to give a "freely given, specific, informed and unambiguous" agreement to the processing (see Article 4(11) of the GDPR). Therefore, in the DPA’s view, Grindr did not collect valid consent:


"Where the controller has several different purposes for processing personal data, and it does not allow for separate consents to be given, there is a lack of freedom and control for the data subject. If the data subject cannot identify and opt in to the processing purposes for which the data subject wishes to give his or her consent […] there is no genuine free choice or control."(See: pp.17-18 of the decision). 

The DPA also underlined that in the case at hand the provision of behavioural advertising was not an essential part of the service, and was certainly not the reason why data subjects used the app. Therefore, users' consent cannot be regarded as "freely given", even if - as Grindr argued - data subjects were informed how to opt out of data sharing with third parties. According to the GDPR, consent should take the form of a statement or a clear affirmative action, and there is no doubt that an opt-out model does not fulfil this condition.

Last but not least, in the EU it is generally forbidden to process special categories of data, the so-called "sensitive data". Information on sexual orientation is considered sensitive (as indicated in Article 9(1) of the GDPR) and as such enjoys a higher standard of protection. In order to process sensitive data, a controller must rely on one of the legal bases stipulated in Article 9(2) of the GDPR. Since Grindr did not lawfully collect consent for the processing, it could not lawfully share the data.

This is not the first and certainly not the last case in which a consent mechanism turns out to be far from exemplary. Just for the record, the issue of consent validity in the context of cookies was examined, inter alia, by the Court of Justice in the Planet49 case (C-673/17; reported on this blog here). Despite clear rules on consent as a legal basis for processing, many controllers still look for new ways to optimise the process of obtaining user consent. Some of them, consciously or not, collect consent in a manner not necessarily consistent with the GDPR. Others try to mislead data subjects by suggesting in their privacy policies or cookie banners, usually in the first information layer, that no personal data is processed without consent, while in fact the processing takes place on the basis of the legitimate interests of the controller. What other practices will emerge in the future? We do not know yet, but we will keep an eye on them.