Sunday 2 February 2020

Data protection (violations) by default: stakeholder views and new developments in enforcement

Recent weeks have brought some interesting new developments in the implementation of the EU rules on data protection, such as the conditions for valid consent to the processing of personal data and the principles of data protection by design and by default. As we have observed numerous times on this blog, developments in data protection are of direct relevance to consumer law and policy, considering that business practices in the digital economy are often connected to the processing of consumer data and, as such, can come within the purview of both fields.

One of the major topics in the ongoing data protection debate concerns default settings. As readers may recall, several months ago we reported on the judgment of the Court of Justice in case C-673/17 Planet49 (CJEU confirms stricter requirements for valid cookie consent...). The case confirmed that - just as under the GDPR - the consent referred to in Articles 2(f) and 5(3) of the E-Privacy Directive cannot validly be obtained by way of a pre-ticked checkbox which the user must deselect to refuse his or her consent.

Pre-ticked checkboxes and similar mechanisms of collecting consumers' "consent" by default are unfortunately still widespread in the digital market. Furthermore, by applying so-called dark patterns, businesses can steer consumer behaviour in the direction they desire even without the use of default settings (for an illustration see: Google tracks every step you take). Fortunately, practices of this kind not only attract the attention of consumer organisations, but are also gradually being addressed by enforcement authorities. Last week a higher regional court in Germany - the Kammergericht Berlin - ruled on a case brought against Facebook by the national association of consumer organisations (vzbv). The case concerned a total of 26 alleged violations of consumer and data protection law, many of which were confirmed by the court. Default "consent" to location tracking, the sharing of links to users' profiles with search engines and the use of names and profile pictures for commercial purposes were all found to violate the applicable rules on data protection. By contrast, Facebook's marketing claim that its services "are free and always will be" was not considered misleading under the national provisions implementing the UCPD.

On the latter point, which turns on the question of whether personal data constitutes a price, the emerging case law is not entirely coherent. Just two weeks before the Berlin ruling, the Regional Administrative Court of Lazio (Tribunale Amministrativo Regionale) partially upheld the decision of the Italian Competition and Market Authority (Autorità Garante della Concorrenza e del Mercato, AGCM), which had considered an analogous slogan, directed at Italian users, to qualify as an unfair commercial practice. The AGCM has meanwhile launched proceedings against Facebook for the company's non-compliance with the earlier decision.

All of this comes at a time of a broader discussion about the interplay of data protection law and consumer law and the application of the - often broadly framed - provisions of both the GDPR and the UCPD. A certain convergence of views appears to be forming between consumer organisations and data protection bodies, even if the overlap is not complete. Consumer organisations seem willing to accept the economic role of data whenever this is beneficial to consumers (as in the case of potentially misleading "free" claims). The European Data Protection Supervisor, however, has argued against any direct analogies between data and price, as illustrated by his position on the recent modernisation of the EU consumer rules (and previously on the Digital Content Directive).

When it comes to data protection by design and by default, the alignment between the two stakeholder groups seems even stronger. Last November the European Data Protection Board published Guidelines 4/2019 on Article 25 GDPR, which have largely been supported by the European umbrella association of consumer organisations, BEUC. The organisation welcomed the operationalisation of both principles, including through the proposed selection of performance indicators and the illustrative case studies. Nonetheless, effective protection of consumer data in the digital economy still has a long way to go. The limited personal scope of Article 25 GDPR, which imposes obligations only on controllers, and the lack of clarity about the role and responsibility of developers/processors have been identified as the major gaps to be filled.