The
UK Competition and Markets Authority recently published a report on the
consequences of the online platforms’ use of algorithms (‘sequences of
instructions to perform a computation or solve a problem’) for consumer
protection and for competition (here).
This report builds on the CMA’s 2018 paper on pricing algorithms (here).
The report starts by highlighting that the increasing sophistication of
algorithms usually means decreasing transparency. The CMA’s report acknowledges
the benefits of algorithms to consumers, such as saving consumers time by
offering them individualized recommendations. Additionally,
algorithms benefit consumers by increasing efficiency, effectiveness, innovation
and competition. However, the main goal of the report is to list (economic) harms
caused to consumers as a result of algorithms.
The
report highlights that big data, machine learning, and AI-based algorithms are
at the core of major market players such as Google (e.g. their search
algorithm) and Facebook (e.g. their news feed algorithm). The CMA also
acknowledges that many of the harms discussed in this report are not new but
were made more relevant by recent technological advances. Finally, the report
acknowledges that the dangers posed by algorithmic systems are even greater
where they significantly impact consumers (such as in decisions about jobs, housing
or credit).
The
harms discussed in the report deal mainly with choice architecture and dark
patterns (e.g. misleading scarcity messages on a given product or misleading
rankings). Additionally, personalization is depicted as a particularly dangerous
harm, since it cannot be easily identified and because it manipulates consumer
choice without that being clear to consumers. Personalization is also worrying
because it targets vulnerable consumers. In particular, the CMA is worried
about possible discrimination as a result of personalization of offers, prices and
other aspects.
Personalized
pricing means that firms charge different prices to different consumers according
to what the firm (and its algorithms) thinks each consumer is willing to
pay. While this has some benefits, such as lowering search costs for consumers,
the CMA warns that consumers might lose trust in the market as a consequence of
personalized pricing practices. While some personalized pricing techniques are
well-known (such as offering coupons or charging lower prices to new customers),
others are more opaque and harder to detect. Non-price personalization
is also described as potentially harmful, such as personalized search result
rankings or personalized recommendation systems (e.g. what videos to show next).
In particular, the CMA warns that these systems may lead consumers to unhealthy
overuse of, or addiction to, certain services and to a fragmented understanding of
reality and public discourse.
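To make the mechanism concrete, here is a minimal sketch of how a personalized pricing system could work, assuming a hypothetical firm that adjusts a base price using signals it associates with willingness to pay. Every signal, adjustment, and profile below is invented for illustration and is not taken from the report.

```python
# Toy illustration (not from the CMA report): a hypothetical firm sets a
# personalized price from crude signals it infers about a consumer.

def estimate_willingness_to_pay(profile: dict) -> float:
    """Hypothetical scoring: start from a base price and adjust it using
    signals a platform might infer about a consumer."""
    price = 100.0
    if profile.get("device") == "premium_phone":
        price += 15.0   # premium device taken as a wealth proxy
    if profile.get("is_new_customer"):
        price -= 15.0   # discount to attract new customers
    if profile.get("searches_for_item", 0) > 3:
        price += 10.0   # repeated searches suggest urgency
    return price

def personalized_price(profile: dict, cost: float) -> float:
    """Charge the estimated willingness to pay, but never below cost."""
    return max(estimate_willingness_to_pay(profile), cost)

new_customer = {"is_new_customer": True}
urgent_shopper = {"device": "premium_phone", "searches_for_item": 5}
print(personalized_price(new_customer, 60.0))    # 85.0: discounted to attract
print(personalized_price(urgent_shopper, 60.0))  # 125.0: priced to urgency
```

The first technique (a new-customer discount) is one of the well-known practices the report mentions; the urgency signal illustrates the more opaque kind that consumers cannot easily detect.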
Additionally,
the use of algorithms can harm competition, since it can exclude competitors
(e.g. through platforms preferencing their own products in rankings). Through
exclusionary practices, dominant firms can stop competitors from challenging
their market position. A prominent example of this is that of Google displaying
its own Google Shopping service in the general search results page more favorably
than competitors that offer similar services. Finally, the CMA report zooms in
on algorithmic collusion, or the use of algorithmic systems to sustain higher
prices.
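As a rough sketch of the intuition behind algorithmic collusion, the following toy model (not taken from the report; the demand rule and all numbers are invented) shows why a price cut stops being profitable once a rival's algorithm matches it instantly:

```python
# Toy illustration (not from the CMA report) of why pricing algorithms can
# sustain high prices: if one firm's algorithm instantly matches any price
# cut, undercutting stops being profitable, so both firms keep the high
# price without any explicit agreement. All numbers are invented.

def profits(price_a: int, price_b: int, demand: int = 100) -> tuple[int, int]:
    """The cheaper firm wins all `demand` customers; ties split the market."""
    if price_a < price_b:
        return price_a * demand, 0
    if price_b < price_a:
        return 0, price_b * demand
    return price_a * demand // 2, price_b * demand // 2

HIGH, CUT = 10, 9

# Without an algorithmic response, undercutting pays:
print(profits(CUT, HIGH))   # (900, 0): the price-cutter takes the market

# With a price-matching algorithm, the cut is matched at once:
print(profits(CUT, CUT))    # (450, 450): worse than staying high
print(profits(HIGH, HIGH))  # (500, 500)
```

Because the matching algorithm removes any gain from undercutting, both firms rationally keep prices high, which is the sustained-price outcome the report warns about.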
The
report also highlights the obstacles brought by lack of transparency,
particularly when it comes to platform oversight. The CMA warns that this lack
of transparency and the misuse of algorithms may lead consumers to stop
participating in digital markets (e.g. by deleting social media apps). This justifies,
in the CMA’s opinion, regulatory intervention. In particular, the CMA
considers that regulators can provide guidance to businesses on how to
comply with the law or develop standards for good practice. Overall, the
report brings attention to the fact that many laws in place do not clearly apply to
algorithmic systems, such as to discrimination in AI systems. Moreover, the CMA highlights
that the application of consumer law to protect consumers against algorithmic discrimination
is still an unexplored area.
The
report ends with a call for further research on the harms caused by algorithmic
systems. The CMA suggests techniques for investigating these harms that do not depend
on access to companies’ data and algorithms, such as enlisting consumers to act
as ‘mystery shoppers’ or through crawling or scraping data from websites. The CMA
also suggests specific investigation techniques when there is access to the
code.
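The 'mystery shopper' idea can be sketched in code: collect the prices quoted for the same product under different simulated shopper profiles, then flag dispersion that might indicate personalized pricing. The profiles, prices, and the 2% tolerance below are all hypothetical.

```python
# Hypothetical sketch of the 'mystery shopper' technique: compare the
# prices quoted to different shopper profiles for the same product and
# flag dispersion that could indicate personalization. Data is invented.

from statistics import mean

def flag_personalization(observations: dict[str, float],
                         tolerance: float = 0.02) -> bool:
    """`observations` maps a shopper profile label to the price it was
    quoted. Returns True if any quote deviates from the mean by more than
    `tolerance` (as a fraction of the mean)."""
    avg = mean(observations.values())
    return any(abs(p - avg) / avg > tolerance for p in observations.values())

# Same product, same time, different (simulated) shopper profiles:
quotes = {
    "fresh_browser": 49.99,
    "logged_in_repeat_visitor": 54.99,
    "price_comparison_referral": 47.49,
}
print(flag_personalization(quotes))  # True: quotes differ by more than 2%
```

In practice the quotes would come from crawling or scraping, as the report suggests; the detection step itself needs no access to the firm's code or data, which is precisely the appeal of these techniques.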
Overall,
this is an extremely comprehensive report that not only explains the biggest consumer
harms brought by algorithmic and AI systems but also contains several practical examples, as
well as concrete methodological suggestions for further research and for better
enforcement. Definitely a recommended read for academics and practitioners alike.