Researchers from the Massachusetts Institute of Technology and Stanford University have found that people say they want privacy but make choices suggesting the opposite, and can be easily manipulated through interface design, reassuring statements, and pizza.
In "Digital Privacy Paradox: Small Money, Small Costs, Small Talk," a paper published Monday through the National Bureau of Economic Research, authors Susan Athey, Christian Catalini, and Catherine Tucker explore a phenomenon that has been widely observed: The disconnect between what people say about privacy and what they do.
It's a discrepancy that calls into question the validity of notice and consent, the foundation of privacy rules.
Catalini, an assistant professor at the MIT Sloan School of Management, said in a phone interview with The Register that there has been a market failure around online privacy.
"We tend to discount that the choices we make will have consequences," he said, suggesting that the notice and consent model is broken.
Other researchers have explored similar territory. Last year, in a Penn State Law Review paper titled "Online Privacy and the Invisible Market for Our Data," New York City Law Department assistant corporation counsel Rebecca Lipman observed that no one reads privacy policies and companies can collect data with few limitations.
"There is nominally a notice and choice regime in place via lengthy privacy policies," Lipman wrote. "However, virtually no one reads them. In this ill-informed environment, companies can gather and exploit as much data as technologically possible, with very few legal boundaries."
To plumb the privacy paradox, Athey, Catalini, and Tucker used data from the MIT digital currency experiment, in which every student was offered $100 in Bitcoin during the fall of 2014.
Had the experiment been about investing, the fivefold increase in Bitcoin's value since then would be worth noting, but its aim was to establish a cryptocurrency community at the school. Piggybacking on that project, the researchers looked at the privacy choices presented to the 3,108 out of 4,494 undergrads who chose to participate by creating digital wallets for their Bitcoin windfall.
They conducted three tests:
The first looked at whether an incentive – "one free pizza you can share with your friends" – would influence participants' willingness to reveal a friend's email address, information consumers consider almost as sensitive as Social Security numbers.
Perhaps unsurprisingly, your friends will give you up for a pizza. "It appears that whatever the stated privacy preference is, students share their friends' data when given a small incentive," the study says.
"When you think about some of the results of our paper, they're possibly depressing because more and more of our data is becoming digital," said Catalini. "The fact that it's so easy to push people into bad privacy decisions is alarming."
The second test explored how sign-up process friction influenced the choice of Bitcoin wallets with varying levels of privacy protection. As Microsoft understood when it presented European customers with a browser choice menu, interface design can influence decisions.
"Small frictions, such as those generated by the ranking of options on a web page, generated large differences in the technology adopted," the report says.
The third test looked at how presenting participants with information about the privacy benefits of PGP email encryption shaped their views on what is more or less a separate issue: Bitcoin privacy.
Evidently, reassuring statements about the privacy benefits of PGP encryption made survey participants more willing to disclose Bitcoin-related information to the public, an intermediary, and the government.
In an email to The Register, Susan Athey, professor of economics at Stanford, said the paper does not address how legislation should be calibrated. "It suggests that users' preferences for privacy may not be particularly strong, which has the implication that if privacy regulation imposes costs, it can be important to carefully consider whether preferences are strong enough to outweigh the costs in the particular context," she said.
Athey argues that privacy discussions should focus more on how privacy choices are presented. "The results that navigation costs matter suggests that how information is presented can be crucially important," she said. "More broadly, our results that navigation costs influence choice suggest that users may not be willing to expend a lot of time to evaluate different options."
"Although our paper does not directly speak to this, I believe that standardizing and simplifying information about privacy and security policies would make it easier for consumers to compare choices without expending a lot of time each time they are confronted with a choice," she said.
Catalini echoed that view. "If companies ask for consent, they should ask in a way that's not designed to induce the answer they want," he said. ®