Wednesday, 26 June 2024
Online safety and "vulnerable" consumers: OFCOM's new draft codes of practice
OFCOM is responsible for enforcing the OSA. Its draft codes of practice are meant to complement the Act and to suggest to platforms (especially social media) how to shape a safer environment, with a focus on underage consumers. Accordingly, the guidelines encourage platforms to be stricter in setting up adequate age-checking procedures, to provide users with adequate tools to report harmful content, to remove such content when necessary, and to clarify the systems used to monitor and moderate online content, particularly with respect to young people.
Friday, 24 May 2024
New action by BEUC: Taming Temu
We all like a good deal! However, those of us who know (a bit more) about consumer rights and consumer law are aware that cheap goods and services often come at a high price: the infringement of our consumer rights.
TEMU, the online marketplace that has gained popularity in the EU, has recently come under the spotlight. This month, BEUC, the European Consumer Organisation, took a significant step by initiating an enforcement campaign against TEMU, named 'Taming Temu'. The campaign is a response to TEMU's violation of its consumer protection obligations under the Digital Services Act. The identified breaches include:
- failing to provide sufficient traceability of the traders that sell on its platform, and thereby failing to ensure that the products sold to EU consumers conform to EU law.
- using manipulative practices such as dark patterns to get consumers, for example, to spend more than they originally intended, or to make it harder for them to close down their accounts.
- failing to provide transparency about how it recommends products to consumers.
BEUC filed a complaint with the European Commission, while 17 of BEUC’s members filed the same complaint with their competent national authorities. For a more efficient and effective enforcement action, BEUC asks the Digital Services Coordinators of each country (the national authorities responsible for enforcing the EU’s Digital Services Act) to transfer the complaints to the Irish authority, as Ireland is TEMU’s country of registration. It would then be up to the Irish authority to take swift action to prevent further consumer harm.
Given the fast growth in the number of TEMU users, it is possible that the platform will pass the threshold of 45 million users per month, which would classify it as a ‘very large online platform’ and give the Commission competence to enforce the Digital Services Act against it.
Given that consumer law enforcement, especially against large platforms, has been less than effective in the past (see for instance our reports here and here), a concerted EU action is a welcome development.
Thursday, 24 August 2023
Tomorrow Never Dies: VLOPs debacle
Most of us in the consumer protection field celebrate that as of tomorrow, August 25th, the obligations of the Digital Services Act (DSA) will start binding very large online search engines (VLOSEs) and online platforms (VLOPs).
The designation occurred on the basis of self-reported user data (more than 10% of the EU population as active users), and the platforms were given 4 months to start complying with the obligations that the DSA introduced for VLOPs and VLOSEs. These obligations aim to improve transparency (detailed reporting obligations, etc.), strengthen user empowerment (improved content moderation, an opt-out from profiling/recommender systems, enhanced protection of minors, bans on advertisements based on sensitive data, etc.) and facilitate enforcement (via reporting and cooperation obligations).
Unsurprisingly, we have already seen some pushback against these new obligations. Namely, Amazon Store brought an action before the General Court claiming that it should not have been designated as a VLOP (see case T-367/23) and that some of the DSA obligations should not apply to it (the duty to provide users, for each recommender system, with an option that is not based on profiling; the duty to compile and publish an advertisement repository). The arguments that Amazon Store brings are based on the principle of equal treatment and the need to protect Amazon's fundamental rights. The latter is quite ironic, considering that one of the contested obligations, ensuring that users may opt out of being profiled, aims to protect users' fundamental right to privacy.
This claim may have just been a strategy by Amazon Store to delay its compliance with the DSA. It is hard to imagine that it would succeed in proving that it is not a VLOP (see also BEUC's commentary on this here). We will follow this case, but for now, let us hope that tomorrow brings a positive change!
Sunday, 30 October 2022
Digital Services Act published
For anyone trying to look up the status of the Digital Services Act, please note that it was published on 27 October 2022 in the Official Journal L 277, page 1.
Monday, 22 November 2021
European data protection authorities speak up on targeted advertising
Dear readers,
this is a teaching-intensive autumn across European universities - with all the excitement, uncertainty and overall strains of being mostly back in class after over a year of mostly living room lecturing.
This, however, should not mean that we let crucial developments go unnoticed: last week, in fact, the European Data Protection Board (EDPB) issued its most resolute opinion yet on the matter of privacy and behavioural tracking. Cookies, in other words - a staple not only of many people's secret kitchen stashes but also of equally elusive locations on our devices.
The occasion for the opinion is the Commission's proposed Digital Services Act, which according to the Board should be brought more clearly into line with data protection rules. Couched among guidelines and standpoints on a number of highly salient issues - from counterterrorism to face recognition AI - the EDPB has called for:
1) considering a phase-out of targeted ads based on "pervasive tracking";
2) in any event, prohibiting targeted ads addressed at children.
The opinion does not expand on the reasons for this standpoint, but mainly refers to previous positions contained in comments on the DSA by the European Data Protection Supervisor (EDPS) and the European Parliament. In fact, criticism of the current rules' focus on informed consent has been around for at least the better part of the past decade (see, for a classic treatment, Frederik Borgesius).
The European Data Protection Board is composed of representatives of the national data protection authorities. As a collective body mirroring positions in the Member States, its position can perhaps carry more sway than the occasionally more principled stances of the EDPS.
Tuesday, 15 December 2020
Who is on the nice or naughty list?: Digital Markets Act and Digital Services Act
Digital Markets Act
The advantage of the Digital Markets Act for consumers lies chiefly in the environment this regulation aims to promote: a more competitive one, which encourages consumers to seek out, access and conclude the best possible deal, rather than locking them into the digital services contracts they conclude. The gatekeepers to the market may be companies located anywhere in the world, provided they offer their core platform services to business users or end users in the EU (Article 1(2)). End users do not have to be consumers (Article 2(16)), as they may be legal persons as well as natural persons.
Core platform services can be (Article 2(2)): online intermediation services, online search engines, online social networking services, video-sharing platform services, number-independent interpersonal communication services, operating systems, cloud computing services and advertising services.
Gatekeepers are defined (Article 3) as providers of core platform services who have a significant impact on the internal market (e.g. an annual EEA turnover exceeding EUR 6.5 billion over the last 3 financial years, while providing the core platform service in at least 3 Member States), operate a core platform service which serves as an important gateway for business users to reach end users (e.g. more than 45 million monthly active end users located in the EU and more than 10,000 yearly active business users established in the EU), and enjoy an entrenched and durable position in their operations, or could be expected to do so (e.g. where they were an important gateway for the last 3 financial years). These thresholds are just an example, as the Commission has the power to designate other providers as gatekeepers (Article 3(6)). Generally, it is the gatekeeper who should notify the Commission that it has reached such a position (Article 3(3)).
Articles 5 and 6 specify the main obligations of the gatekeepers. These seem to focus on:
- limiting the scope in which gatekeepers may use personal data;
- not limiting the possibility for business users to offer the same products or services to end users on different platforms under different conditions;
- not restricting communication between business users and end users outside the platform;
- allowing end users to uninstall any pre-installed software applications on the platform, as well as facilitating the installation of third-party software applications;
- not prioritising in ranking the products or services offered by the gatekeeper itself;
- limiting the bundling of services;
- facilitating switching between services;
- transparency obligations.
Further provisions of the proposal for this regulation are devoted mainly to various monitoring obligations, review processes and consequences for non-compliance (mainly fines).
Digital Services Act
The main advertised advantage of this Act for consumers is that the provisions of the regulation will offer them 'more choice, lower prices', 'less exposure to illegal content' and 'better protection of fundamental rights' (see here). These benefits are to result if the aims of the Regulation, as defined in its Article 1(2), are achieved. Similarly to the DMA, this Act will also apply to providers of intermediary services regardless of their place of establishment, as long as the services are provided to recipients established or located in the EU (Article 1(3)). A recipient of the service again does not need to be a consumer, as it may be a legal person (Article 2(b)).
The obligations imposed by this Act are to bind providers of intermediary information society services (Article 2(f)), that is, services where the provider acts as a 'mere conduit' (transmitting information provided by recipients or providing access to a communication network, e.g. Internet access providers), a 'caching' service (temporarily storing the transmitted information to make its onward transmission more efficient), or a 'hosting' service (storing the information provided by, and at the request of, the recipient of the service, e.g. cloud services).
Articles 3-5 specify when providers of the various information society services in the above-mentioned categories may be exempted from liability for the content of the information they handle, i.e. when that content is illegal. These provisions thus aim to ensure that consumers (citizens) are indeed exposed to less illegal content (will they suffice? that is a question that goes beyond this short summary!).
Importantly, if a provider conducts a voluntary own-initiative investigation, this does not change the status of its exemption from liability as determined in Articles 3-5 (Article 6). Thus self-checks are to be encouraged. However, there remains no general obligation to monitor the information which providers transmit or store (Article 7), similarly to the current E-Commerce Directive provision.
Articles 8-9 regulate how providers are to comply with orders about removing illegal content and providing information about users, respectively.
Other provisions of interest to us include:
- Article 12 - which obliges providers to transparently outline any content moderation procedures and tools in their terms and conditions;
- Article 13 - requiring providers to publish an annual report on the past year's content moderation; Article 23 also adds an obligation to report on the number of disputes submitted to ADR, the number of suspensions (Article 20) and any use of automated means for the purpose of content moderation;
- Article 14 - providers of hosting services are required to put in place mechanisms allowing users to notify them about illegal content, and to act upon such notifications (notice and action);
- Article 15 - requires hosting service providers to notify users, in a reasoned and transparent manner, if their content is removed or disabled as illegal;
- Article 17 - sets out rules on an internal complaint-handling system for online platforms, which they are to make available to recipients of services for at least 6 months following either the removal of or disabling of access to information provided by recipients, the suspension or termination of the service, or the suspension or termination of the account;
- Article 17(4) and Article 18 - recipients of services should be informed about and have access to ADR;
- Article 19 - specifies rules for processing notices submitted by trusted flaggers;
- Article 20 - determines when providers could suspend users' accounts due to misuse, i.e. frequent posting of illegal content;
- Article 22 - specifies that online platforms are required to obtain information from traders allowing their identity and location to be traced, and should check the reliability of this information;
- Article 24 - online advertising has to be clearly identified as such, together with the parameters used to determine the recipient to whom the advertisement is displayed;
- Articles 26-33 - contain obligations for very large online platforms, in the same categories as described above, but with stricter provisions.
Further provisions pertain to various enforcement and penalty issues, as well as to plans to encourage the adoption of standards and codes of conduct.