Thursday, 4 July 2019

Recent developments in online content moderation

The discussion about the role of platform operators in content moderation is perhaps as old as online intermediaries themselves. From the very beginning, it has involved a delicate balance between conflicting considerations: eg how to protect the freedom of expression, the freedom to conduct a business and the right to an effective remedy, while ensuring that intellectual property and personality rights are safeguarded and harmful content does not thrive. The solution established by Articles 14 and 15 of the E-Commerce Directive has continuously been put to the test (for the latest installment in the CJEU case law series, see C-18/18 Glawischnig-Piesczek). The outgoing European Commission's approach of choice has been to keep the legal framework intact for the time being, while pursuing a set of non-legislative initiatives such as recommendations and codes of conduct (eg on hate speech and online disinformation).

[Image: screenshot of the Facebook website]
This did not stop national stakeholders from taking further action. In particular, in 2017 the German legislature adopted a new law, the so-called Network Enforcement Act (NetzDG), which imposes a legal obligation on the operators of social media platforms to take down illegal content within set time limits and to report on the number of complaints received. The act came into force in January 2018 and has just shown its teeth for the first time, with a EUR 2 million fine imposed on Facebook. Interestingly, the decision of the Federal Office of Justice (Bundesamt für Justiz) does not concern a failure of the platform operator to remove illegal content, but rather its alleged non-compliance with transparency duties. According to the German enforcer, the option to file a complaint under the NetzDG was harder to find on Facebook than the option to report a post for violating the platform's "community standards". Time will tell whether the decision holds up in the appeal proceedings.

Meanwhile, allegations of insufficient blocking and reporting are not the only problems faced by online platforms these days. Two ongoing legal proceedings in Poland offer an illustrative counterexample. In 2016, the president of the association Reduta Dobrego Imienia (Polish League Against Defamation) brought a case against Facebook over the alleged overblocking of right-wing content. More recently, Facebook was sued yet again by a Polish NGO whose content was blocked by the platform operator, this time from the opposite side of the socio-political spectrum. The dispute in SIN v. Facebook concerns the blocking of a site providing reliable information about the use of psychoactive substances, not to encourage their consumption, but as part of a harm reduction strategy. Both cases are still pending; the big news so far is that in each of them the Polish courts recognized their jurisdiction on the basis of Article 7(2) of Regulation 1215/2012.