The Web Foundation’s Tech Policy Design Lab is working on an intriguing project to combat deceptive design — also known as dark patterns — with the goal of creating a portfolio of UX and UI prototypes that it hopes tech companies will adopt and policymakers will draw on as they craft rules to make the online experience less exploitative of web users. The Design Lab was established last year as the “active arm” of a larger Web Foundation effort, first unveiled in late 2018 — when the non-profit digital access organization proposed a “Contract for the Web” in honor of the web’s 30th birthday, an initiative led by World Wide Web pioneer Sir Tim Berners-Lee.
The purpose of the deceptive design patterns project, according to Kaushalya Gupta, its program management lead who also runs the Tech Policy Design Lab, is to bring “human-centered design” to bear on essential web interactions. The initiative will be fueled by a series of workshops later this year that will bring together a diverse group of industry and user stakeholders to co-design prototypes. The goal is to produce a set of pro-user interface design templates that can serve as benchmarks for how tech platforms shape these essential (and sometimes cynically constructed) decision and interaction flows.
“The Lab is a place where people’s experiences shape policy as well as product design, where solutions consider the broad range of people who use digital technologies,” says Gupta. “Design thinking and human-centered design shape our approach to policy formulation.” The deceptive design patterns initiative grew out of a survey in which the Lab polled the thousands of businesses and other entities that had signed up for the Contract to choose its initial focus, narrowing a list of 200 “promising subjects” down to deceptive design.
Western internet users may be most familiar with deceptive design patterns from cookie consent banners, which typically (and intentionally) make it far easier to agree to hand over your privacy to the data-industrial adtech complex than to refuse tracking, often burying the refusal behind extra steps (if they offer that choice at all). E-commerce supplies plenty of regrettably familiar examples as well.
Consider how Amazon aggressively nudges shoppers into inadvertently signing up for its subscription-based Prime membership scheme: it dangles free shipping at the point of purchase to distract from a simultaneous sleight of hand, since the shipping is only free if you also agree to a free trial of Prime, after which the e-tail giant will begin billing you for what is normally a paid service. (And if you do unintentionally sign up for this “free” shipping trial, you’ll have to go on a not-so-merry dance through numerous menu layers to find the option to cancel Prime.)
However, Gupta makes the important point that deceptive design is also prevalent in the global south, where such tricks may be disproportionately harmful: many more web users there have little online experience, so they are less accustomed to the “usual tactics” and find it harder to detect and avoid dark patterns and the harms that follow. She also notes that people in the global south are more likely to access the internet on smaller devices, which can make deceptive nudges easier to pull off; with less screen real estate, it is harder to see what is really going on.