Dark patterns – user interfaces designed to deviously manipulate people into doing things – have become common enough on websites and in apps that almost two dozen providers have sprung up to supply behavioral persuasion as a service.
And in some cases, these firms openly advertise deceptive marketing techniques, describing ways to generate fake product orders and social messages celebrating those fake orders.
This finding is one of several from seven computer science boffins – Arunesh Mathur, Gunes Acar, Michael Friedman, Elena Lucherini, Jonathan Mayer, Marshini Chetty, and Arvind Narayanan – all from Princeton University in the USA, except for Chetty, who hails from the University of Chicago.
On Tuesday, the meticulous seven published a draft research paper, Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites, that explores the prevalence of interface-driven influencing techniques.
"We found 22 third-parties that offer 'dark patterns as a service,'" said Arvind Narayanan, a professor at Princeton. "The psychology research behind nudges has been weaponized."
The researchers analyzed the top 11,000 websites, as ranked by Amazon's Alexa service, using a custom crawler that visits e-commerce sites and completes the click flow to purchase products, then saves the interfaces encountered and interactions for analysis.
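The crawler itself isn't reproduced here, but the idea of turning a saved checkout page into analyzable units can be sketched in a few lines. Below is a toy illustration – not the Princeton team's actual pipeline – that splits a saved page's HTML into visible text snippets, the kind of unit a clustering step could operate on, using only Python's standard library (the `SegmentExtractor` class and the sample page are invented for this example):

```python
# Toy sketch (not the authors' pipeline): split a saved checkout page
# into candidate text segments for later clustering and analysis.
from html.parser import HTMLParser

class SegmentExtractor(HTMLParser):
    """Collects visible text segments, skipping script/style content."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.segments = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        text = " ".join(data.split())  # collapse whitespace
        if text and not self._skip_depth:
            self.segments.append(text)

def extract_segments(html: str) -> list[str]:
    parser = SegmentExtractor()
    parser.feed(html)
    return parser.segments

# Invented sample page fragment for demonstration
page = """<div><p>Only 2 left in stock!</p>
<script>var x = 1;</script>
<span>Hurry, sale ends in 09:59</span></div>"""
print(extract_segments(page))
# → ['Only 2 left in stock!', 'Hurry, sale ends in 09:59']
```

A real crawl would, as the paper describes, also drive the checkout click flow in a browser before saving anything; this sketch only covers the text-extraction step.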
The boffins found 1,841 dark patterns, representing 15 distinct types, on 1,267 of those 11,000 shopping websites – roughly 11 per cent of the data set. And they propose seven categories for such user-interface tricks:
Sneaking – misrepresenting user actions, or delaying information that users would likely object to if it were made available to them.
Urgency – imposing a deadline on a sale or deal, thereby accelerating user decision-making and purchases.
Misdirection – using visuals, language, or emotion to steer users toward or away from a particular choice.
Social proof – influencing users' behavior by describing the experiences and behavior of other users.
Scarcity – signalling that a product is likely to become unavailable, thereby increasing its desirability to users.
Obstruction – making it easy for the user to get into a situation but hard to get out of it.
Forced action – forcing the user to do something tangential in order to complete their task.
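To make the categories concrete, here is a deliberately crude heuristic – nothing like the researchers' taxonomy, which was built by manually analyzing crawled interfaces – that flags text segments resembling two of the categories above, urgency countdowns and scarcity messages. The phrase lists are invented for illustration:

```python
import re

# Hypothetical heuristics, not the paper's classifiers: each maps a
# category name to a regex of tell-tale phrasing.
PATTERNS = {
    "urgency": re.compile(
        r"\b(ends in|hurry|limited time|\d{1,2}:\d{2}(:\d{2})?)\b", re.I),
    "scarcity": re.compile(
        r"\b(only \d+ left|low stock|almost gone|selling fast)\b", re.I),
}

def flag_segments(segments):
    """Return (segment, category) pairs for segments matching a heuristic."""
    hits = []
    for seg in segments:
        for category, pattern in PATTERNS.items():
            if pattern.search(seg):
                hits.append((seg, category))
    return hits

demo = ["Only 3 left at this price!", "Free shipping over $50",
        "Hurry, deal ends in 01:59:59"]
print(flag_segments(demo))
# → [('Only 3 left at this price!', 'scarcity'),
#    ('Hurry, deal ends in 01:59:59', 'urgency')]
```

Keyword matching like this would produce plenty of false positives, which is precisely why the paper pairs its crawl with clustering and manual review rather than fixed phrase lists.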
Dark patterns, ethically dubious though they may be, are not necessarily illegal. "Not all dark patterns are illegal, but they are nonetheless problematic because they are intended to prey on our cognitive limitations and weaknesses," added Prof Narayanan.
But some do violate the law. In Europe, the Consumer Rights Directive makes Sneaking dark patterns illegal, the researchers claim. They also note that the 234 instances of deception they found on 183 websites are unlawful in the US, the EU, and other jurisdictions.
Legislators have already taken note. In April, US Senators Mark R. Warner (D-VA) and Deb Fischer (R-NE) proposed the Deceptive Experiences To Online Users Reduction (DETOUR) Act, which aims to prevent large service providers – more than 100 million monthly users – from using deceptive interface designs for software applications.
The researchers say they hope their technology for identifying dark patterns will prove useful to watchdogs. "The crawling and clustering methodology that we developed is readily generalizable, and it radically reduces the difficulty of discovering and measuring dark patterns at web scale," the researchers explained in their paper.
They also hope their work will inspire countermeasures like the creation of a website that names and shames e-commerce sites that rely on dark patterns. ®