
Thursday, 30 November 2023

BEUC and NOYB oppose Meta's pay-or-consent model


I am sure you have noticed that in early November, Meta launched paid subscriptions for its social media. You may now choose to stop receiving targeted advertisements on Facebook and Instagram under one condition - you have to pay €9.99/month on the web or €12.99/month on the iOS and Android versions of the apps. Of course, without payment you can still use the services, but then you have to accept personalised advertisements, which means you accept that your data is processed for this purpose. This Meta policy is the result of various disputes with European institutions and national supervisory authorities over Meta's practices of processing users' personal data (including the July ruling in case C-252/21, in which the Court of Justice found several of Meta's approaches to personal data unlawful)*.

The very announcement of paid subscriptions triggered a wave of criticism, so it did not take long for the first steps to be taken to challenge the legitimacy of Meta's actions. A few days ago NOYB, a non-profit organisation led by privacy activist Max Schrems, announced that it had filed a GDPR complaint against Meta over "Pay or Okay". According to NOYB, such a "privacy fee" is not only illegal, since you cannot be forced to pay for exercising your fundamental right to privacy, but also risks having a domino effect and being adopted by other leading players in the digital services market.

But this is not the only step against Meta's new practice. Today BEUC, the European Consumer Organisation, also voiced its opposition, stating that the model is "an unfair choice for users, which runs afoul of EU consumer law on several counts and must be stopped". BEUC, together with 19 of its members, filed a complaint on the grounds that Meta is engaging in unfair commercial practices in multiple ways. As BEUC stated, partially blocking the use of Facebook and Instagram until users have selected one option or the other constitutes an aggressive practice under European consumer law. What is more, opting for the paid subscription does not guarantee that a user gets a privacy-friendly option involving less tracking and profiling - the user's personal data may still be collected and used, but for purposes other than ads. A more detailed assessment of Meta's subscription model can be found here.

It remains to be seen how these actions will affect Meta's approach in the future. One thing is certain - the story will continue, perhaps before the Court of Justice.

*The Court, inter alia, questioned Meta's legal grounds for processing personal data for personalisation purposes, i.e. Article 6(1)(b) of the GDPR (the necessity of processing for the performance of a contract) and Article 6(1)(f) of the GDPR (processing on the basis of the legitimate interests of the controller or a third party) - see paragraphs 97-126 of the ruling.

Sunday, 27 June 2021

One-stop-shop mechanism of the GDPR clarified by the Court of Justice (case C-645/19 Facebook Ireland)

Last week the Court of Justice delivered an important judgment in case C-645/19 Facebook Ireland. The case offered an opportunity to clarify procedural aspects of the General Data Protection Regulation 2016/679. In particular, it involved topical problems related to the one-stop-shop mechanism provided for in cases of cross-border data processing. The GDPR assigns the role of "lead authority" in this context to the supervisory authority of the main establishment or of the single establishment of the relevant controller or processor. Since many digital companies undertaking large-scale data processing in the EU have their main establishments in Ireland, it is the Irish Data Protection Commissioner that acts as the lead authority in the respective cases. Over the past years, however, the authority has come under strong criticism for failing to act effectively on complaints brought before it, and the particulars of the one-stop-shop mechanism have been the subject of debate (see e.g. our previous posts €50m fine imposed on Google..., Further updates on consumer protection..., BEUC files complaints against Tik Tok). In case C-645/19 Facebook Ireland, the Court addressed some of the issues raised, clarifying when authorities of other Member States are competent to exercise their powers.
 
Facts of the case
 
The background of the case was primarily procedural. The Belgian authority brought a case before a Belgian court against Facebook Ireland, Facebook Inc. and Facebook Belgium, with the aim of bringing an end to the collection of information on the internet browsing behaviour of Facebook users and non-users by means of cookies, social plug-ins, pixels, etc. The court of first instance considered itself competent to give a ruling and confirmed the alleged infringements. The trader brought an appeal against the judgment, and a question was raised whether the Belgian supervisory authority had the required standing and interest to bring proceedings in the first place and, if so, in relation to which violations (e.g. committed by Facebook Inc., Facebook Belgium and/or Facebook Ireland; before and/or after 25 May 2018, that is, the date on which the GDPR and its one-stop-shop mechanism became applicable).

Judgment of the Court
  
Since the judgment is rather technical, the present post does not aspire to provide a comprehensive overview; an interested reader is advised to consult the judgment directly. Instead, we will pick up on selected points, partly in a different order from the one adopted by the Court.
 
Firstly, the Court engaged with the argument put forward by the platform operator that the legal action concerning the facts preceding 25 May 2018 was inadmissible, given that the previously applicable provisions of Belgian law were repealed following the entry into force of the GDPR. The Court addressed this problem from the perspective of EU law, finding that a supervisory authority which brought an action related to cross-border processing taking place before 25 May 2018 may continue to pursue such an action on the basis of the previously applicable Directive 95/46 (para. 105). Put differently, the one-stop-shop mechanism established in the GDPR does not stand in the way of proceedings by different DPAs in relation to violations preceding the GDPR's date of application. The Court, however, did not engage with the interpretation of the previously applicable Directive 95/46 and the question whether its provisions on supervisory authorities could be deemed to have direct effect. By contrast, such an effect was confirmed in relation to the relevant provisions of the GDPR (para. 113).

Arguably, the most interesting part of the judgment concerns the one-stop-shop mechanism itself (the first question). This is where the judgment gets particularly technical: the reasoning is intertwined with extensive references to GDPR provisions and appears to change direction often. Ultimately, para. 71 and the following deserve particular attention. Here the Court finds that the exercise of the power of a non-lead authority to bring actions before the courts of its state cannot be ruled out in the following situations. Firstly, this is the case when the mutual assistance of the lead supervisory authority has been sought under Article 61 of the GDPR and the lead authority did not provide the other authority with the requested information. Secondly, under Article 64(2) of the GDPR, a supervisory authority may request that any matter of general application, or producing effects in more than one Member State, be examined by the European Data Protection Board with a view to obtaining an opinion, in particular where a competent supervisory authority does not comply with the obligations for mutual assistance imposed on it by Article 61 of the GDPR. Following the relevant procedure (that is, if the EDPB approves), the supervisory authority should be able to exercise the power conferred on it by Article 58(5) of the GDPR and take the necessary measures to ensure compliance with the GDPR.

The remaining part of the judgment involved potential additional prerequisites for the exercise of this power by a national authority other than the lead authority in the cases described above; specifically, whether the actions of such non-lead DPAs should be limited to controllers having a main establishment or another establishment on their territory. The Court looked at this problem from a twofold perspective and opted for a reading that does not significantly restrict the powers of such non-lead authorities (paras. 84 and 96). Put differently, it remains theoretically possible for a non-lead Belgian authority to initiate or engage in legal proceedings against a company like Facebook Inc.
 

Tuesday, 16 February 2021

BEUC files complaints against Tik Tok

Today, BEUC and 15 national consumer associations filed coordinated complaints against Tik Tok, a platform popular with older children and teenagers. The complaint combines claims related to purported violations of the Unfair Terms Directive, the Unfair Commercial Practices Directive and the GDPR.

Part of Tik Tok's business model is relatively specific - namely, the platform sells tokens that can be used to "give presents" to other users as a reward for the performances shown in their videos. BEUC complains that several of the terms regulating these tokens are non-transparent and substantively unfair. The complaint seems to connect nicely to the recent European Parliament report by our very own Joasia Luzak and colleague Marco Loos calling for closer scrutiny of the terms of service of online providers.

The platform's young target audience is an important driver behind some of the concerns: hidden advertising and harmful content are particularly sensitive issues where minors are involved. Other complaints, such as Tik Tok's nonchalant approach to control over user-generated content (see earlier news of users finding themselves included in ads without any prior knowledge or consent on their side), seem to reflect more general concerns with privacy and user control/copyright.

The choice to build such a large coordinated complaint may be an attempt to circumvent previously identified problems with enforcing the GDPR against Tik Tok - namely, the lack of clarity as to which authority would be competent under the Regulation's "one-stop-shop" rule. This problem was only partially addressed when the company decided to establish its data processing activities in Ireland, which is also the European data hub for many other tech companies.

Thursday, 4 February 2021

CMA's paper on algorithms & online platforms: comprehensive report on benefits and perils of AI regulation

The UK Competition and Markets Authority (CMA) recently published a report on the consequences of online platforms’ use of algorithms (‘sequences of instructions to perform a computation or solve a problem’) for consumer protection and for competition (here). This report builds on the CMA’s 2018 paper on pricing algorithms (here). The report starts by highlighting that the increasing sophistication of algorithms usually means decreasing transparency. It acknowledges the benefits of algorithms to consumers, such as saving consumers’ time by offering them individualized recommendations. Additionally, algorithms benefit consumers by increasing efficiency, effectiveness, innovation and competition. However, the main goal of the report is to list the (economic) harms caused to consumers as a result of algorithms.

The report highlights that big data, machine learning and AI-based algorithms are at the core of major market players such as Google (e.g. its search algorithm) and Facebook (e.g. its news feed algorithm). The CMA also acknowledges that many of the harms discussed in the report are not new but have been made more relevant by recent technological advances. Finally, the report acknowledges that the dangers brought by algorithms are even greater where they impact consumers significantly (such as decisions about jobs, housing or credit).

The harms discussed in the report deal mainly with choice architecture and dark patterns (e.g. misleading scarcity messages about a given product or misleading rankings). Additionally, personalization is depicted as a particularly dangerous harm, since it cannot be easily identified and because it manipulates consumer choice without this being clear to consumers. Personalization is also worrying because it can target vulnerable consumers. In particular, the CMA is worried about possible discrimination as a result of the personalization of offers, prices and other aspects.

Personalized pricing implies that firms charge different prices to different consumers according to what the firm (and its algorithms) thinks the consumer is willing to pay. While this has some benefits, like lowering search costs for consumers, the CMA warns that consumers might lose trust in the market as a consequence of personalized pricing practices. While some personalized pricing techniques are well known, such as offering coupons or charging lower prices to new customers, others are more opaque and harder to detect. Non-price personalization is also described as potentially harmful, for example personalized search result rankings or personalized recommendation systems (e.g. which videos to show next). In particular, the CMA warns that these systems may lead consumers to unhealthy overuse of, or addiction to, certain services and to a fragmented understanding of reality and public discourse.

Additionally, the use of algorithms can harm competition by excluding competitors (e.g. through platforms preferencing, via ranking, their own products). Through exclusionary practices, dominant firms can stop competitors from challenging their market position. A prominent example is Google displaying its own Google Shopping service more favourably in the general search results page than competitors offering similar services. Finally, the CMA report zooms in on algorithmic collusion, i.e. the use of algorithmic systems to sustain higher prices.

The report also highlights the obstacles created by a lack of transparency, particularly when it comes to platform oversight. The CMA warns that this lack of transparency and the misuse of algorithms may lead consumers to stop participating in digital markets (e.g. by deleting social media apps). This justifies, in the CMA’s opinion, regulatory intervention. In particular, the CMA considers that regulators can provide guidance to businesses on how to comply with the law or elaborate standards of good practice. Overall, the report draws attention to the fact that many laws in place do not clearly apply to algorithmic systems, for example to discrimination in AI systems. Moreover, the CMA highlights that the application of consumer law to protect consumers against algorithmic discrimination is still an unexplored area.

The report ends with a call for further research on the harms caused by algorithms. The CMA suggests techniques to investigate these harms that do not depend on access to companies’ data and algorithms, such as enlisting consumers to act as ‘mystery shoppers’ or crawling or scraping data from websites. The CMA also suggests specific investigation techniques for cases where there is access to the code.

Overall, this is an extremely comprehensive report that not only explains the biggest consumer harms brought about by algorithms but also contains several practical examples, as well as concrete methodological suggestions for further research and for better enforcement. Definitely a recommended read for academics and practitioners alike.

Tuesday, 26 May 2020

Facebook ventures further into social commerce: implications for consumer protection

GUEST POST BY
Dr Christine Riefa, Reader, Brunel University
@cyberchristine

Facebook announced the launch of Facebook Shops on 19 May 2020, a feature primarily aimed at small businesses wanting to sell online. While it is presented as a solution to help during the pandemic, the move had been on the cards for a while (starting with the launch of Libra, a cryptocurrency, in 2019). Yet the launch comes at a time when many shops have had to close during the pandemic and are trying to find viable ways to continue selling. It also comes against the backdrop of a surge in the uptake of online commerce during lockdowns around the world.

So far, sales on Facebook were limited to the use of Marketplace. Facebook Marketplace only enabled users to post adverts and sellers to send direct messages with a view to concluding a sale, but it did not support online payments. Marketplace was primarily built for C2C sales (although it was also used by some small businesses). Facebook Shops will drastically change this. It is billed as a rival to Amazon and Etsy in capturing the online e-commerce market. This follows on from other social commerce ventures by Facebook on other platforms it owns, notably Instagram. On Instagram, users can make use of a ‘shop now’ button (although this functionality is reserved for a small selection of partners). The ‘shoppable posts’ allow consumers to click on featured items and purchase without leaving the Instagram platform.

The Facebook Shops feature will enable payments to be taken and retailers to set up shops available from both Facebook and Instagram. The service will be free for businesses to use as Facebook relies on advert sales to make the venture profitable. The system also allows retailers to link to third party platforms to manage inventories. It promises to make social commerce seamless, a quality it has so far lacked, mostly because payment solutions did not exist to integrate with this new selling method.

The arrival of this new offering seems to cement the rise of social commerce as a new retail channel. To date, social commerce (i.e. social media tools and interactive technologies used in an electronic commerce setting) had been developing but remained embryonic. Facebook’s move may well finally launch social commerce for good.

This raises some important questions for consumer protection. Most of the legislation adopted to frame online purchases has focused on electronic commerce. As social commerce is not simply transactional, but also builds on a rich social, interactive and collaborative shopping experience (see Yang (2015) 24 Retailing Consumer Serv.), many of the rules in place may not be fully adapted to it. After all, Facebook Shops wants people to ‘experience the joy of shopping versus the chore of buying’ (see https://about.fb.com/news/2020/05/introducing-facebook-shops/). Yet consumer law has primarily developed on the basis of the information paradigm. This implies that buying is more akin to a chore, where the ‘average consumer’ is expected to do their homework and arrive at a sound purchasing decision. It requires time spent on the small print and on studying the suitability of a product to one’s needs. As a result, the shift of emphasis announced by Facebook for its new social commerce offering calls into question some of the underlying rationale of legislation and established policy direction. Besides, consumers will be able to easily share posts about products they are interested in or have purchased, signalling their preferences to their social networks. While Facebook promises this sharing will be at the discretion of users, other aggregated data on browsing will be collated and shared with businesses, as well as influence the selection of adverts a consumer may see (https://about.fb.com/news/2020/05/privacy-matters-facebook-shops/). This raises questions relating to freedom of choice, when big data effectively comes to frame those choices and may also lead to some framing of prices (through price personalization).

This leads one to reflect on whether consumer law in its current form is fit for purpose and can serve consumers in their social commerce experiences. There are currently a number of pervasive legal issues associated with social commerce:
-       Legal identification of traders in a social commerce context;
-       Online reviews, notably fake reviews and endorsements;
-       Personalised advertising based on data gathered on social media;
-       Potential for personalised pricing that may prove discriminatory and/or cause detriment by artificially raising the price of goods offered;
-       Control of digital influencer marketing;
-       Sale of fake and/or dangerous products on social media platforms;
-       Controlling sales and enforcement of the law across geographical boundaries;
-       Regulation of liability on social commerce platforms.

As social commerce becomes more mainstream, those questions will urgently need answers. The danger is, of course, that while consumers may have learnt to be wary of retailers’ ability to inflate the truth about their products, they are less suspicious and potentially more easily influenced where a product is marketed and sold via the intermediary of influencers, or where a product is posted by someone in their social network. In this context, the already failing underpinnings of information as a shortcut for protection, inflated expectations placed on consumers to behave as rational economic agents, and underperforming public enforcement alongside an absence of platform liability may well all line up to create consumer detriment on a large scale.

Notes:
This blog post builds on previous research published by the author. Notably, see C. Riefa, ‘Beyond e-commerce: some thoughts on regulating the disruptive effect of social (media) commerce’ (Além do comércio eletrônico: algumas reflexões sobre a regulação dos efeitos maléficos do comércio social (mídia)), Revista de direito do consumidor RDC (Brazil) 127 (Jan-Feb 2020), 281-304, available at SSRN: <http://ssrn.com/abstract=3608016>; C. Riefa, ‘Consumer Protection on Social Media Platforms: Tackling the Challenges of Social Commerce’ in T. Synodinou, Ph. Jougleux, Ch. Markou, Th. Prastitou (eds), EU Internet Law in the Digital Era (Springer, 2019);
C. Riefa, L. Clausen, Towards Fairness in Digital Influencers’ Marketing Practices 8 (2019) 2 EuCML 64-74, available at SSRN: <https://ssrn.com/abstract=3364251>.

Sunday, 6 October 2019

Monitoring duties of online platform operators before the Court - case C-18/18 Glawischnig-Piesczek

Before the summer we briefly referred to the opinion of Advocate General Szpunar in case C-18/18 Glawischnig-Piesczek (see: Recent developments in online content moderation...). Last Thursday, the Court of Justice delivered its judgment in the case, clarifying the interpretation of Articles 15 and 18 of Directive 2000/31/EC on electronic commerce.

Background of the case

The case concerned a defamatory comment published on Facebook about a member of the Austrian Greens party, Ms Eva Glawischnig-Piesczek. The politician brought an action against the operator, requesting it to cease and desist from publishing photographs of her if the accompanying text contained allegations identical to those declared illegal, or allegations with equivalent content. In doing so she relied on Austrian provisions authorising the courts to order host providers to terminate or prevent an infringement, in line with Articles 14(3) and 18 of the E-Commerce Directive. The referring court, however, had doubts whether an order to remove or disable access not only to a particular item of information, but also to equivalent items, complied with Article 15(1) of Directive 2000/31. Pursuant to this provision, Member States shall not impose on providers of, among others, hosting services a general obligation to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity. The referring court also wondered about the territorial scope of such an order (for a similar discussion about the right to be forgotten, see: No one-size-fits-all approach to search engine de-referencing...).

Judgment of the Court

The Court gave a comparatively broad reading to Article 18 of Directive 2000/31 concerning judicial powers to adopt measures designed to terminate alleged infringements and prevent further impairment of the interests involved. According to the Court, Member States enjoy a broad discretion in relation to the actions and procedures for taking the necessary measures (para. 29). This margin of discretion is due to, among others, the rapidity and geographical extent of the damage arising in connection with information society services, both of which were clearly at play in the present case (para. 36).

Having said that, the Court decided to distinguish between injunctions concerning information whose content is identical to the one which was previously deemed illegal and injunctions concerning information with equivalent content (whose message remains "essentially unchanged and therefore diverges very little from the content which gave rise to the finding of illegality", para. 39).

In the former case, the Court confirmed broad powers of the national court and found that a host provider can be ordered to block access to or remove information with identical content, irrespective of who requested the storage of that information. The injunction granted for that purpose cannot be regarded as imposing on the host provider a general monitoring obligation, but rather concerns the monitoring ‘in a specific case’ (paras. 34, 37). 

When it comes to information with equivalent content, the Court sought a balanced solution. It considered that injunctions should generally be able to extend to information, the content of which, "whilst essentially conveying the same message, is worded slightly differently, because of the words used or their combination, compared with the information whose content was declared to be illegal" (para. 41). The objective of an injunction, however, may not be pursued by imposing an excessive obligation on the host provider. To achieve this objective, the injunction must properly identify the specific elements of equivalent information, such as the name of the person concerned, the circumstances of the infringement and content equivalent to that which was declared to be illegal (para. 45). The monitoring of and search for information required of the host provider should be limited to information containing the elements specified in the injunction and be capable of being carried out by automated search tools and technologies (para. 46). Differences in the wording of equivalent content must not, in any event, be such as to require the host provider concerned to carry out an independent assessment of that content.

As regards territorial scope, the Court once again confirmed the broad reading of Article 18(1), Directive 2000/31, which "[did] not make provision ... for any limitation, including a territorial limitation, on the scope of the measures which Member States are entitled to adopt" (para. 49). Following the judgment, therefore, the E-Commerce Directive does not preclude the relevant injunctions from producing worldwide effects. Member States must, nevertheless, ensure that the measures which they adopt take due account of the rules applicable at international level.

Concluding thoughts

The judgment of the Court has multiple implications. Firstly, it strengthens the protection of parties affected by illegal content, but seeks to achieve this without undermining the validity of Article 15. As such, it does not provide a straightforward solution to each and every future case and sets quite demanding requirements for both national courts and host providers. The former need to define what content they consider to be equivalent to that which has been deemed illegal. How courts will cope with such a task remains an open question. Host providers, in turn, must be ready to take steps to monitor their platforms for identical or equivalent information, which, as the Court suggests, may require the use of technological tools. The same seems to be true for smaller platforms, even if arguments related to the rapidity and geographical extent of the damage may not apply to them with equal force.

The judgment in C-18/18 Glawischnig-Piesczek is clearly relevant beyond the social media context. As noted by Christian Twigg-Flesner in a recent entry, the ruling can also be applied to other platforms, such as online marketplaces. Operators of such platforms could be required to take steps to monitor their content, e.g. as regards the recurring presence of misleading information. The question remains whether the same could also become true for persons engaging in illegal actions.

Finally, attention should be drawn to the brief part of the judgment concerning territorial scope of online moderation. One cannot help noticing the similarity between this question and the one addressed in recent Google case. In Glawischnig-Piesczek, the Court did not provide for an equally balanced framework, but limited itself to stating that injunctions with worldwide effects are not precluded by Directive 2000/31. This remains in line with the opinion of Advocate General Szpunar - notably, the same AG whose advice was followed in the Google case. Both findings are, therefore, not necessarily inconsistent. In fact, the opinion in Glawischnig-Piesczek explicitly refers to the Google case. According to the AG, like with the right to be forgotten, "the legitimate public interest in having access to information will necessarily vary, depending on its geographic location, from one third State to another" (para. 99). Consequently, the limitation of extraterritorial effects of injunctions concerning harm to private life and personality rights, for example by way of geo-blocking, may remain "in the interest of international comity" (para. 100). Whether this is how the Court's reference to "the rules applicable at international level" is going to be read, nevertheless, is far from certain.

Tuesday, 30 July 2019

CJEU in Fashion ID (C-40/17): some consequences of embedding social plugins

Yesterday, the CJEU published its judgment in Fashion ID, a case concerning mainly the notion of "controller" under EU data protection law.

The facts of the case are relatively simple: Fashion ID had placed a "like" button on its website which was connected to Facebook. What Fashion ID's customers may not realise is that - even if they did not use it - the button's presence meant that information concerning them was being transmitted to Facebook. In the proceedings it was uncontested that this information qualified as personal data.

Verbraucherzentrale NRW, a consumer association, sought an injunction against Fashion ID demanding that it abandon this practice. The question whether Fashion ID has any obligations in connection with the data processing - including the duty to inform consumers that their data are being collected and/or to obtain their consent - depends on whether the website operator is to be considered a data controller.

The referring court doubted whether this is the case since the website operator has no control over the processing of the data transmitted to the plugin provider (para 37).

The Court, in essence, answered that the operator of the website acts as a controller, and is thus responsible for informing the consumer or collecting their consent, insofar as the collection of information and its transmission to Facebook are concerned. In particular, concerning the collection of the user's consent, the Court highlighted that it would not be in line with efficient and timely protection of the data subject's rights if consent were given only to the second controller, which is involved at a later stage (para 102). Even more strongly, when a customer is not a Facebook user, their data will be processed by the social media operator without them having any direct connection to the latter, which makes the responsibility of the other provider all the greater (para 83).

However, the website operator is not responsible vis à vis the data subjects for any other uses that Facebook itself will make of the data, nor for collecting their consent in that respect (para 102).

While the website operator has no control over the use of the transmitted data, the purpose of the collection is in part related to the website's own benefit, as it allows better promotion of its products (paras 77-81).

As concerns the collection of data without the subject's consent - i.e. data processing based on the pursuit of a legitimate interest - the Court importantly clarified that where both the website operator and the provider of the social plugin are controllers, each of them must be pursuing a legitimate interest for this ground of processing to apply (para 96).

The decision interprets the relevant provisions of the "old" Data Protection Directive, which has meanwhile been replaced by the GDPR - but the concepts it deals with have been retained in the Regulation, so the decision can be transposed to the new rules.

Quite unsurprisingly, the Court rejected Fashion ID's claim that consumer associations would not be entitled to bring any claims under data protection rules. While Article 80(2) of the GDPR quite famously invites Member States to set up collective enforcement mechanisms, nothing in the previous Directive, which only contained general indications on enforcement, can be seen to stand in the way of Member States allowing consumer associations to bring such claims (see in particular paras 57-62).

The Court seems to be aware of the potentially high-profile nature of this case and has accompanied the publication of its decision with a press release.

Thursday, 4 July 2019

Recent developments in online content moderation

The discussion about the role of platform operators in content moderation is perhaps as old as online intermediaries themselves. From the very beginning it has involved a delicate balance between conflicting considerations: e.g. how to protect the freedom of expression, the freedom to conduct a business and the right to an effective remedy while ensuring that intellectual property and personality rights are safeguarded and harmful content does not thrive. The solution established by Articles 14 and 15 of the E-Commerce Directive has continuously been put to the test (for the latest installment in the CJEU case law series, see C-18/18 Glawischnig-Piesczek). The approach of choice of the outgoing European Commission has been to keep the legal framework intact for the time being, while pursuing a set of non-legislative initiatives such as recommendations and codes of conduct (e.g. on hate speech and online disinformation).

Screenshot of Facebook website
This did not hinder national stakeholders from taking further action. In particular, in 2017 the German lawmaker came up with a new law - the so-called Network Enforcement Act (NetzDG) - which imposed a legal obligation on the operators of social media platforms to take down illegal content within set time limits and to report the number of complaints. The act came into force in January 2018 and has just shown its teeth for the first time, with a €2 million fine imposed on Facebook. Interestingly, the decision of the Federal Office of Justice (Bundesamt für Justiz) does not concern a failure of the platform operator to remove illegal content, but rather its alleged non-compliance with transparency duties. According to the German enforcer, the option for making a complaint under the NetzDG was harder to find on Facebook than the option for complaining that a post violated the platform's "community standards". Time will tell whether the decision holds in the appeal proceedings.

Meanwhile, allegations of insufficient blocking and reporting are not the only problems faced by online platforms these days. Two ongoing legal proceedings in Poland offer an illustrative counterexample. In 2016 a case was brought against Facebook by the president of the association Reduta Dobrego Imienia (Polish League against Defamation) over the alleged over-blocking of right-wing content. More recently, Facebook was sued yet again by a Polish NGO whose content was blocked by the platform operator, this time from the opposite side of the socio-political spectrum. The dispute in SIN v. Facebook concerns the blocking of a site providing reliable information about the use of psychoactive substances; not to encourage their consumption, but as part of a harm reduction strategy. The cases are still pending; the big news so far is that in both cases the Polish courts recognised their jurisdiction on the basis of Article 7(2) of Regulation 1215/2012.

Monday, 4 March 2019

Online platforms will be online platforms

As we reported in October last year (Combating online disinformation...), the major online platforms operating in the EU (e.g. Facebook, Google, Twitter) signed a Code of Practice against disinformation and promised to do better in controlling for and eliminating fake news. This interest in increasing information transparency was mainly motivated politically - ahead of the elections to the European Parliament in May 2019 - but it should also have an impact on the transparency of consumer information, e.g. through the scrutiny of advertisement placements and the blocking of fake accounts. That is, provided that the online platforms actually deliver on their commitments. To ensure they do, the Commission obliged them to report monthly on the actions undertaken. The first reports, of January 2019, are not really promising though (Commission asks online platforms to provide more details on progress made). Only Google provided data on actions taken in January to enhance the scrutiny of ad placements throughout the Member States; yet even with this report the Commission considers that it has not been given enough detail to fully understand how the undertaken actions combat disinformation.

Thursday, 14 February 2019

German regulator restricts Facebook data sharing

On 7 February 2019, the Bundeskartellamt, the German competition regulator, issued a decision against Facebook restricting its processing of user data.

The Bundeskartellamt points out that Facebook is in a dominant position, with a market share of 95%. The closure of Google+, one of Facebook's competitors, has intensified this dominance. Other companies, such as Twitter or LinkedIn, are considered to operate in only part of Facebook's market.

The decision states that the way Facebook collects, merges and uses data across its subsidiaries amounts to an abuse of a dominant position under competition law. One of the most troubling practices employed by Facebook is that it collects third-party data on users in an almost unlimited way and attaches all of these data to the users' Facebook accounts. Data are collected not only by other Facebook-owned services, but by any website that has an embedded Facebook button. It is worth noting that users' data were collected even if they did not interact with the Facebook buttons (e.g. even if they didn't 'like' a page).

What is even more concerning is that data are collected even when there is no visible Facebook element on the page at all, as long as the website uses Facebook Analytics. This widespread collection of data allows Facebook to build very detailed profiles of its users.

With its decision, the Bundeskartellamt forbids this practice. Facebook, Instagram and WhatsApp will still be able to collect data on their users. However, Facebook will be prevented from assigning these data to a single user account unless it has the users' voluntary consent. Consent will likewise be required for the processing of data collected on third-party websites. The decision requires Facebook to make changes to its terms of service and data processing: the processing of data from third parties without the consent of users needs to be substantially limited, and Facebook will have to come up with proposals on how to achieve that.

This decision comes after the publication of the first reports on the Code of Practice against disinformation, signed by Facebook and other large online companies such as Google, Twitter and Mozilla. Facebook has to strengthen its commitments to empower consumers and boost cooperation with fact-checkers. However, if Facebook is serious about preventing its platform from becoming a fertile ground for those who seek to spread disinformation, it should first and foremost protect its users and their data from those who want to abuse them.

The decision is not yet final, as Facebook has one month to appeal before the German courts. It remains to be seen whether Facebook will challenge the decision. The decision also points to the increasing intersections between consumer law, data protection law and competition law: the Bundeskartellamt notes that its investigation required close cooperation with data protection authorities.

This may be the dawn of a new age in which the traditional compartmentalisation of law no longer serves us as well as in the past. Consumer law, too, will have to adapt in order to address the challenges arising from novel business models, especially in relation to data protection.