Tuesday 15 December 2020

Who is on the nice or naughty list?: Digital Markets Act and Digital Services Act

The European Commission today published two potentially game-changing proposals for the digital market: the Digital Services Act (see here) and the Digital Markets Act (see here). For those of our readers who have not followed the work on these two proposed regulations: the Digital Services Act aims to 'comprehensively' regulate the obligations of digital service providers towards users of their services as well as towards enforcement authorities, whilst the Digital Markets Act sets out the rules for the digital market, e.g. defining its largest participants as so-called 'gatekeepers' and imposing additional obligations on them. These proposals will now be discussed further by the European Parliament and the Council. Below we provide a summary of the proposals.

Digital Markets Act

The advantage of the Digital Markets Act for consumers lies in the environment this regulation aims to promote: a more competitive one, encouraging consumers to seek out, access and conclude the best possible deal, rather than locking them into the digital services contracts they have concluded. The gatekeepers to the market may be companies located anywhere in the world, provided they offer their core platform services to business users in the EU or end users in the EU (Article 1(2)). The end users do not have to be consumers (Article 2(16)), as they may be legal as well as natural persons.

Core platform services include (Article 2(2)): online intermediation services, online search engines, online social networking services, video-sharing platform services, number-independent interpersonal communications services, operating systems, cloud computing services and advertising services.

Gatekeepers are defined (Article 3) as providers of core platform services who: have a significant impact on the internal market (e.g. annual EEA turnover exceeding EUR 6.5 billion in the last 3 financial years, while providing the core platform service in at least 3 Member States); operate a core platform service which serves as an important gateway for business users to reach end users (e.g. more than 45 million monthly active end users located in the EU and more than 10,000 yearly active business users established in the EU); and enjoy (or could foreseeably enjoy) an entrenched and durable position in their operations (e.g. where they served as such an important gateway for the last 3 financial years). These thresholds are not exhaustive, as the Commission has the power to designate other providers as gatekeepers (Article 3(6)). Generally, it is the gatekeeper who should notify the Commission that they have reached such a position (Article 3(3)).

Articles 5 and 6 specify the main obligations of the gatekeepers. These focus on:

- limiting the scope in which platforms use personal data;

- not preventing business users from offering the same products or services to end users on other platforms under different conditions;

- not restricting communication between business users and end users outside the platform;

- allowing end users to un-install any pre-installed software applications on the platform, as well as facilitating the installation of third-party software applications;

- not prioritising in ranking the products or services offered by the gatekeeper itself;

- limiting the bundling of services;

- facilitating the switching of services;

- transparency obligations.

Further provisions of the proposal for this regulation are devoted mainly to various monitoring obligations, review processes and consequences for non-compliance (mainly fines).

Digital Services Act 

The main advertised advantage of this act for consumers is that the provisions of the regulation will offer them 'more choice, lower prices', 'less exposure to illegal content' and 'better protection of fundamental rights' (see here). These benefits are to result if the aims of the Regulation, as defined in its Article 1(2), are achieved. Similarly to the DMA, this Act will also apply to providers of intermediary services regardless of their place of establishment, as long as the services are provided to recipients established or located in the EU (Article 1(3)). A recipient of the service again does not need to be a consumer, as it may be a legal person (Article 2(b)).

The obligations imposed by this Act are to bind providers of intermediary information society services (Article 2(f)), that is, services whose providers act as either: a 'mere conduit' (transmitting information provided by recipients or providing access to a communication network, e.g. Internet access providers); a 'caching' service (temporarily storing the transmitted information to make its onward transmission more efficient); or a 'hosting' service (storing the information provided by, and at the request of, the recipient of the service, e.g. cloud services).

Articles 3-5 specify when providers of information society services in the above-mentioned categories may be exempted from liability for the information they transmit or store, i.e. when it constitutes illegal content. These provisions thus aim to ensure that consumers (citizens) are indeed exposed to less illegal content (whether they will suffice is a question that goes beyond this short summary!).

Importantly, if a provider conducts a voluntary own-initiative investigation, this does not change the status of their exemption from liability as determined in Articles 3-5 (Article 6). Self-checks are thus encouraged. At the same time, there remains no general obligation to monitor the information which providers transmit or store (Article 7), similarly to the current E-Commerce Directive provision.

Articles 8-9 regulate how providers are to comply with orders about removing illegal content and providing information about users, respectively.

Other provisions of interest include:

- Article 12 - which obliges providers to transparently outline any content moderation procedures and tools in their terms and conditions;

- Article 13 - requiring providers to publish an annual report on the past year's content moderation; Article 23 adds an obligation for online platforms to also report on the number of disputes submitted to ADR, the number of suspensions (Art 20) and any use of automatic means for the purpose of content moderation;

- Article 14 - requires providers of hosting services to put in place mechanisms allowing users to notify them of illegal content, and to act upon such notices ('notice and action');

- Article 15 - requires hosting service providers to inform users, in a reasoned and transparent manner, when their content is removed or disabled as illegal;

- Article 17 - sets out rules on an internal complaint-handling system for online platforms, which they are to provide to recipients of services for at least 6 months following either the removal or disabling of access to information provided by recipients, the suspension or termination of the service, or the suspension or termination of the account;

- Article 17(4) and Article 18 - recipients of services should be informed about and have access to ADR;

- Article 19 - specifies rules for the processing of notices submitted by trusted flaggers;

- Article 20 - determines when providers may suspend users' accounts due to misuse, i.e. the frequent posting of illegal content;

- Article 22 - specifies that online platforms are required to obtain information from traders allowing the identity and location of such traders to be traced, and should verify the reliability of this information;

- Article 24 - online advertising has to be clearly displayed as such, together with the parameters used to determine the recipient to whom the advertisement is displayed;

- Articles 26-33 - contain obligations for very large online platforms, covering similar categories as described above, but with stricter requirements.

Further provisions pertain to various enforcement and penalties issues, as well as the plans to encourage adoption of standards and codes of conduct.