Wednesday 26 June 2024

Online safety and "vulnerable" consumers: OFCOM's new draft codes of practice

The UK Office of Communications (OFCOM) has recently released its new draft "Children's safety online" codes of practice under the Online Safety Act (OSA), the main statute providing for online safety in the UK, which became law in October 2023.

OFCOM is responsible for enforcing the OSA. Its draft codes of practice are meant to complement the Act and to guide platforms (especially social media) on how to shape a safer environment, with a focus on underage users. Accordingly, the guidelines encourage platforms to be stricter in setting up adequate age-verification procedures, to provide users with appropriate tools to report harmful content, to remove such content when necessary, and to clarify the systems used to monitor and moderate online content, particularly with respect to young people.

Platforms are expected to take measures proportionate to their size, the purpose of their services, and the risk of harm to children. The Online Safety Act thus adopts a risk-based approach, similar to that of the European Digital Services Act (Regulation (EU) 2022/2065). The purpose is indeed the same: ensuring that online service providers implement procedures able to tackle threats to online safety while safeguarding users' privacy rights and freedom of expression. Both acts therefore cover issues such as content moderation and harmful generative AI outputs, such as deepfakes. They both reflect the increasing attention lawmakers are paying to new forms of digital vulnerability and to how these can be addressed.

OFCOM’s choice to enact codes of practice is in line with the European approach, too; the EU legislature has likewise emphasized the role of codes of conduct and soft law in shaping a safer digital environment. The draft codes also provide for transparency, placing on platforms the duty to make their risk-assessment findings publicly available and to offer systems for easily reporting illegal content.
And the regulator has gone even further: it has declared that it could even ban people under the age of 18 from accessing platforms that fail to comply with its guidelines.

However, some questions arise from reading the new OFCOM document. Will the procedures for ascertaining users’ age lead to the collection of an excessive amount of personal data, violating the data minimization principle under the GDPR? Are OFCOM’s codes too restrictive compared to the DSA, forcing platforms to adopt a “double standard” between users based in the UK and those based in the rest of Europe? And is the Online Safety Act sufficiently “technologically neutral” in differentiating platforms’ obligations on the basis of content type, or is the more comprehensive approach adopted by the DSA, based on equal risk mitigation for all illegal content, to be preferred?

Despite these concerns, which will undoubtedly be examined further by scholars and privacy advocates in the coming months, the guidelines seem promising for improving the online safety of UK users, especially children. The true test will be their implementation and enforcement.