Thursday 27 October 2022

OECD report on Dark Patterns

 Yesterday, 26 October 2022, the OECD released a report on Dark Patterns which had been in the making for almost two years. LLM students who would like to write about the topic, or just about anyone looking for a clear introduction to the subject - this report is your friend! It contains not only a helpful classification of different types of dark patterns but also a quite comprehensive review of relevant regulatory frameworks and interventions, known case-law and much (if not all, and if arguably too US-centred and English-based) of the literature you may also want to look at, including... Joasia's 2019 JCP paper The Transparent Trap! Kudos there.

A working definition is provided at the outset, which may or may not gain traction in the field: dark patterns, on this definition, are 

"business practices employing elements of digital choice architecture, in particular in online user interfaces, that subvert or impair consumer autonomy, decision-making or choice. They often deceive, coerce or manipulate consumers and are likely to cause direct or indirect consumer detriment in various ways, though it may be difficult or impossible to measure such detriment in many instances."

[The first part of the report, where dark patterns are typified and their impact assessed, I skip for now - but you can find it all online!]

The report acknowledges that more enforcement is necessary in the EU, while ultimately praising the UCPD's relative ability to address the problem in comparison with other instruments: on the one hand, the report observes, resonance with the blacklisted items in the annex makes it possible to address certain dark patterns with a degree of legal certainty; on the other, the "principle-based" prohibition of unfair commercial practices works quite well to cover technological and commercial developments like the ones at hand. 

One critical point that is (thankfully) mirrored in the report is the known criticism of the average consumer standard: this standard is hard to square with consumers' apparent vulnerability to dark patterns and other online perils and, the report observes, seems particularly problematic in the context of increasing online personalisation. The report also highlights criticism of disclosure rules, in particular as a way of preventing consumers from falling into dark traps: it turns out, the report concludes, that all experiments trying to measure the effects of disclosures in this area failed to detect any serious improvement. Hence the relevance of information may be limited to broader education campaigns and possibly to a limited set of dark patterns. 

The report also interestingly reviews examples of technical tools that are being developed - essentially, dark pattern-blockers for one's browser. These are, apparently, useful in some cases but less so when the dark pattern cannot simply be "written away" in code (p 47). I would like an app like that though!

As a scholar who reads Law & Econ work with a mix of interest and scepticism, I was less impressed by the report's discussion of nudges on page 37, under "Digital choice architecture". The title reflects a trend that has been going on for a long time, of course; the report, however, brings together under one technique concerns that may need to be kept separate. "Privacy by design", which is mentioned as an example, is not the same as a "bright pattern" based on extrapolating "welfare enhancing" choices from supposed "preferences or expectations". While the report necessarily gives a limited overview of each issue, conflating privacy protection with "consumertarian" views and hard-core nudge advocacy is to my mind quite problematic.

Anyway, this is really a good starting point, but also, as far as I can tell, a fairly comprehensive restatement that those already in the debate will benefit from. Recommended read!