
Wednesday, 26 June 2024

Online safety and "vulnerable" consumers: OFCOM's new draft codes of practice

The UK's Office of Communications (OFCOM) has recently released its new draft "Children's safety online" codes of practice in line with the Online Safety Act (OSA), the main act providing for online safety in the UK, which became law in October 2023.

OFCOM is responsible for enforcing the OSA. Its draft codes of practice are meant to complement the Act and to suggest to platforms (especially social media) how to shape a safer environment, with a focus on underage consumers. Accordingly, the guidelines encourage platforms to be stricter in setting up adequate age-checking procedures, to provide users with proper tools to report harmful content, to remove such content when necessary, and to clarify the systems used to monitor and moderate online content, particularly with respect to young people.

Platforms ought to take measures based on their size, the purpose of their services, and the risk of harm to children. The Online Safety Act thus adopts a risk-based approach, similar to the European Digital Services Act (Regulation (EU) 2022/2065). The purpose is indeed the same: ensuring that online service providers implement procedures able to tackle threats to online safety while safeguarding users' privacy rights and freedom of expression. The two acts therefore cover issues such as content moderation and harmful generative AI outputs, such as deepfakes. They both show the increasing attention placed by lawmakers on new forms of digital vulnerability, and on how to address them.

OFCOM's choice to enact codes of practice is in line with the European approach, too; in fact, the EU lawmaker has also emphasized the relevance of codes of conduct and soft law in shaping a safer digital environment. The draft codes provide for transparency, placing on platforms the duty to make their risk-assessment findings available to the public and to offer systems for easily reporting illegal content.
And OFCOM has gone even further: it has declared that it could even ban people under the age of 18 from accessing platforms that fail to comply with its guidelines.

However, some questions arise from reading the new OFCOM document. Will the procedures for ascertaining the age of users lead to the collection of an excessive amount of personal data, violating the data minimization principle under the GDPR? Are OFCOM's codes too restrictive compared to the DSA, forcing platforms to adopt a "double standard" between users based in the UK and those based in the rest of Europe? Is the Online Safety Act sufficiently "technologically neutral" in differentiating platforms' obligations on the basis of content type, or is the more comprehensive approach adopted by the DSA, based on equal risk mitigation for all illegal content, to be preferred?

Despite these concerns, which will undoubtedly be examined further by many scholars and privacy advocates in the coming months, the guidelines seem promising for improving the online safety of UK users, especially children. The true test will be their implementation and enforcement.

Friday, 24 May 2024

New action by BEUC: Taming Temu

We all like a good deal! However, those of us who know (a bit more) about consumer rights and consumer law are aware that cheap goods and services often come at a high price, by infringing our consumer rights.

TEMU, the online marketplace that has gained popularity in the EU, has recently come under the spotlight. This month, BEUC, the European Consumer Organisation, took a significant step by initiating an enforcement campaign against TEMU, named 'Taming Temu'. The campaign is a response to TEMU's alleged breaches of its consumer protection obligations under the Digital Services Act. The identified breaches include:

  •   failing to provide sufficient traceability of the traders that sell on its platform, and thereby failing to ensure that the products sold to EU consumers conform to EU law.
  •   using manipulative practices such as dark patterns to get consumers, for example, to spend more than they might originally want to, or to complicate the process of closing down their account.
  •   failing to provide transparency about how it recommends products to consumers.

BEUC filed a complaint with the European Commission, while 17 of BEUC's members filed the same complaint with their competent national authorities. For a more efficient and effective enforcement action, BEUC asks the Digital Services Coordinators of each country (the national authorities responsible for enforcing the EU's Digital Services Act) to transfer the complaints to the Irish authority, since Ireland is TEMU's country of registration in the EU. It would then be up to the Irish authority to take swift action to prevent further consumer harm.

Given the fast growth in the number of TEMU users, it is possible that the platform will pass the threshold of 45 million users per month, which would then classify it as a 'very large online platform' and grant the Commission competence to enforce the Digital Services Act against it.

Given that consumer law enforcement, especially against large platforms, was less effective in the past (see for instance our reports here and here), a concerted EU action is a welcome solution.

Thursday, 24 August 2023

Tomorrow Never Dies: VLOPs debacle

Most of us in the consumer protection field celebrate that as of tomorrow, August 25th, the obligations of the Digital Services Act (DSA) will start binding very large online search engines (VLOSEs) and very large online platforms (VLOPs).

The European Commission designated the first set of VLOPs and VLOSEs on April 25th (see here). To recall, VLOSEs encompass Bing and Google Search, whilst VLOPs are: Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube and Zalando. 

The designation occurred on the basis of self-reported user data (more than 10% of the EU population as active users), and the platforms were given four months to start complying with the obligations that the DSA introduced for VLOPs and VLOSEs. These obligations aim to improve transparency (detailed reporting obligations, etc.), strengthen user empowerment (improved content moderation, opt-outs from profiling-based recommender systems, enhanced protection of minors, bans on advertisements based on sensitive data, etc.) and facilitate enforcement (via reporting and cooperation obligations).

Unsurprisingly, we have already seen some pushback against these new obligations. Namely, Amazon Store brought an action before the General Court claiming that it should not have been designated a VLOP (see case T-367/23) and that some of the DSA obligations should not be applicable to it (the duty to provide users with an option, for each recommender system, that is not based on profiling; and the duty to compile and publish an advertisement repository). The arguments Amazon Store puts forward are based on the principle of equal treatment and the need to protect Amazon's fundamental rights. The latter is quite ironic, considering that one of the contested obligations, ensuring that users may opt out of being profiled, aims to protect users' fundamental right to privacy.

This claim may have just been a strategy from Amazon Store to delay its compliance with the DSA. It is hard to imagine that they would be successful in proving that they are not a VLOP (see also BEUC's commentary on this here). We will follow this case but for now, let us hope that tomorrow brings a positive change!

Sunday, 30 October 2022

Digital Services Act published

For anyone trying to look up the status of the Digital Services Act, please note that it was published on 27 October 2022 in the Official Journal L 277, page 1.

Monday, 22 November 2021

European Data Protection authorities speak up on targeted advertisement

 Dear readers, 

this is a teaching-intensive autumn across European universities - with all the excitement, uncertainty and overall strains of being mostly back in class after over a year of mostly living room lecturing. 

This, however, should not mean that we let crucial developments go unnoticed: last week, in fact, the European Data Protection Board (EDPB) issued its most resolute opinion yet on the matter of privacy and behavioural tracking. Cookies, in other words - a staple not only of many people's secret kitchen stashes but also of equally elusive locations on our devices.

The occasion for issuing this opinion was to comment on the Commission's proposed Digital Services Act, which according to the Board should be brought more clearly in line with data protection rules. Couched among guidelines and standpoints on a number of highly salient issues - from counterterrorism to face recognition AI - the EDPB has called for

1) considering a phase-out of targeted ads based on "pervasive tracking";

2) in any event, prohibiting targeted ads addressed at children.   

The opinion does not expand on the reasons for such a standpoint, but mainly refers to previous positions contained in comments on the DSA by the European Data Protection Supervisor (EDPS) and the European Parliament. In fact, criticism of the current rules' focus on informed consent has been around for at least the better part of the past decade (see, for a classic, Frederik Borgesius).

The European Data Protection Board is composed of representatives from the national data protection authorities. As a collective body mirroring positions in the Member States, its position can perhaps carry more sway than the occasionally more principled stances of the EDPS.

Tuesday, 15 December 2020

Who is on the nice or naughty list?: Digital Markets Act and Digital Services Act

The European Commission today published two potentially game-changing proposals for the digital market: the Digital Services Act (see here) and the Digital Markets Act (see here). For those of our readers who have not followed the work on these two proposed regulations, the Digital Services Act aims to 'comprehensively' regulate the obligations of digital service providers towards users of their services as well as towards enforcement authorities, whilst the Digital Markets Act sets out the rules for the digital market, e.g. defining its largest participants as so-called 'gatekeepers' and providing for additional obligations for them. These proposals will now be discussed further by the European Parliament and the Council. Below we provide a summary of the proposals.

Digital Markets Act

The advantage of the Digital Markets Act for consumers definitely lies in the environment this regulation aims to promote: a more competitive one, encouraging consumers to seek out, access and conclude the best possible deal, rather than locking them into the digital services contracts they conclude. The gatekeepers to the market are companies located anywhere in the world, provided they offer their core platform services to business users or end users in the EU (Article 1(2)). The end users do not have to be consumers (Article 2(16)), as they may be legal persons as well as natural persons.

Core platform services could be (Article 2(2)): online intermediation services, online search engines, online social networking services, video-sharing platform services, number-independent interpersonal communication services, operating systems, cloud computing services, and advertising services.

Gatekeepers are defined (Article 3) as providers of core platform services that: have a significant impact on the internal market (e.g. annual EEA turnover exceeding EUR 6.5 billion in each of the last three financial years, while providing the core platform service in at least three Member States); operate a core platform service which serves as an important gateway for business users to reach end users (e.g. more than 45 million monthly active end users located in the EU and more than 10,000 yearly active business users established in the EU); and enjoy an entrenched and durable position in their operations, or could be expected to do so (e.g. where the service was an important gateway throughout the last three financial years). These thresholds are just an example, as the Commission has the power to designate other providers as gatekeepers (Article 3(6)). Generally, it is the gatekeeper who should notify the Commission that it has reached such a position (Article 3(3)).
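Since the quantitative presumptions in Article 3 are cumulative, a minimal sketch may help to see how they fit together. The snippet below (in Python) is our own simplification for illustration only: the function and variable names are hypothetical and not part of the proposal, which also allows designation beyond these figures.

# Illustrative sketch (not part of the DMA proposal): the cumulative quantitative
# presumptions for gatekeeper status under Article 3, heavily simplified.

def presumed_gatekeeper(eea_turnover_last_3y_eur, member_states_served,
                        monthly_active_end_users_eu, yearly_active_business_users_eu,
                        years_as_important_gateway):
    # Significant impact on the internal market
    significant_impact = (all(t > 6_500_000_000 for t in eea_turnover_last_3y_eur)
                          and member_states_served >= 3)
    # Important gateway for business users to reach end users
    important_gateway = (monthly_active_end_users_eu > 45_000_000
                         and yearly_active_business_users_eu > 10_000)
    # Entrenched and durable position (e.g. gateway for the last three financial years)
    entrenched_position = years_as_important_gateway >= 3
    return significant_impact and important_gateway and entrenched_position

# Hypothetical example: all three presumptions are met, so the provider would be
# presumed a gatekeeper (subject to notification and the Commission's assessment).
print(presumed_gatekeeper([7e9, 8e9, 9e9], 5, 50_000_000, 12_000, 3))  # True

The point of the sketch is simply that a provider must satisfy all three limbs at once; falling below any single threshold removes the presumption, though not the Commission's residual designation power under Article 3(6).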

Articles 5 and 6 specify the main obligations of the gatekeepers. These focus on: limiting the scope in which gatekeepers may use personal data; not restricting business users from offering the same products or services to end users on other platforms under different conditions; not restricting communication between business users and end users outside the platform; allowing end users to uninstall any pre-installed software applications and facilitating the installation of third-party applications; not prioritising the gatekeeper's own products or services in rankings; limiting the bundling of services; facilitating switching between services; and various transparency obligations.

Further provisions of the proposal for this regulation are devoted mainly to various monitoring obligations, review processes and consequences for non-compliance (mainly fines).

Digital Services Act 

The main advertised advantage of this act for consumers is that the provisions of the regulation will offer them 'more choice, lower prices', 'less exposure to illegal content' and 'better protection of fundamental rights' (see here). These benefits are to result if the aims of the Regulation, as defined in its Article 1(2), are achieved. Similarly to the DMA, this Act will also apply to providers of intermediary services regardless of their place of establishment, as long as the services are provided to recipients established or located in the EU (Article 1(3)). A recipient of the service again does not need to be a consumer, as it may be a legal person (Article 2(b)).

The obligations imposed by this Act are to bind providers of intermediary information society services (Article 2(f)), that is, services whose providers act as a 'mere conduit' (transmitting information provided by recipients or providing access to a communication network, e.g. Internet access providers), a 'caching' service (temporarily storing the transmitted information to make its onward transmission more efficient), or a 'hosting' service (storing the information provided by, and at the request of, the recipient of the service, e.g. cloud services).

Articles 3-5 specify when providers of information society services in the above-mentioned categories may be exempted from liability for the content of the information they transmit or store, i.e. when it amounts to illegal content. These provisions thus aim to ensure that consumers (citizens) are indeed exposed to less illegal content (will they suffice? that is a question that goes beyond this short summary!).

Importantly, if a provider conducts a voluntary own-initiative investigation, this does not change the status of its exemption from liability as determined in Articles 3-5 (Article 6). Self-checks are thus encouraged. However, there remains no general obligation to monitor the information which providers transmit or store (Article 7), similarly to the corresponding provision of the current E-Commerce Directive.

Articles 8-9 regulate how providers are to comply with orders about removing illegal content and providing information about users, respectively.

Other provisions of interest to us include:

- Article 12 - which obliges providers to transparently outline any content moderation procedures and tools in their terms and conditions;

- Article 13 - requiring providers to publish an annual report on the past year's content moderation; Article 23 also adds an obligation to report on the number of disputes submitted to ADR, the number of suspensions (Article 20) and any use of automated means for content moderation;

- Article 14 - providers of hosting services are required to make it easy for users to notify them about illegal content and to act upon such notifications (notice and action);

- Article 15 - requires hosting service providers to inform users, in a reasoned and transparent manner, if their content is removed or disabled as illegal;

- Article 17 - sets out rules on the internal complaint-handling system that online platforms are to provide to recipients of services for at least six months following the removal of, or disabling of access to, the information provided by recipients, the suspension or termination of the service, or the suspension or termination of the account;

- Article 17(4) and Article 18 - recipients of services should be informed about and have access to ADR;

- Article 19 - specifies rules for processing of notifications of trusted flaggers;

- Article 20 - determines when providers could suspend users' accounts due to misuse, i.e. frequent posting of illegal content;

- Article 22 - specifies that online platforms are required to obtain information from traders allowing the identity and location of such traders to be traced, and should check the reliability of this information;

- Article 24 - online advertising has to be clearly displayed as such, together with the parameters used to determine the recipient to whom the advertisement is displayed;

- Articles 26-33 - contain obligations for very large online platforms, in the same categories as described above, but with stricter provisions.

Further provisions pertain to various enforcement and penalties issues, as well as the plans to encourage adoption of standards and codes of conduct.

Tuesday, 12 November 2019

Long live the E-Commerce Directive? First discussions on the Digital Services Act

Last week we reported on the Council's adoption of the first part of the New Deal for Consumers – the directive on better enforcement and modernisation of EU consumer protection rules. With the next European Commission soon beginning its mandate, public attention is gradually shifting to possible new initiatives affecting consumers in the digital market. The political guidelines of (then-candidate) Ursula von der Leyen shed light on two major areas: online platforms and artificial intelligence.

In both fields the initiatives are likely to build upon prior developments in the outgoing Commission. To recall, earlier this year the High-Level Expert Group on AI appointed by the Commission presented the Ethics Guidelines for Trustworthy Artificial Intelligence. The guidelines consider many issues of relevance to consumers, such as human agency, safety, privacy, transparency, fairness and accountability. So far, however, it remains rather unclear how the guidelines will inform further actions at the EU level.

By contrast, we are gradually hearing more and more about possible initiatives on online platforms. In this regard, the Commission has so far followed a “problem-specific approach” as illustrated by the targeted amendments to the consumer acquis, audio-visual media law or copyright law, the adoption of P2B regulation as well as multiple soft law measures on tackling illegal content. The E-Commerce Directive has technically remained unaffected, even though the tendency towards more responsibility of platform operators has been quite clear. According to the more recent reports, the Commission under President von der Leyen is expected to step up these efforts under the banner of “Digital Services Act”. The discussion about its shape are also at an early stage, yet it is not excluded that the new approach will turn out to be still more of essentially the same. According to the recent presentation from the Commission to the Council experts, current discussions appear to be centred on strengthening the cooperation between national regulatory authorities and potentially common rules on tackling different types of illegal content (such as hate speech) at the EU level. Consultations are expected to be launched at the beginning of 2020, potentially leading to more concrete proposals by the end of that year. Stay tuned!