Amazon and Google: Trust us, our smart-speaker apps are carefully policed. Boffins: Yes, well, about that...

Who can you trust these days?

The voice applications people use with their Amazon Alexa and Google Assistant smart speaker devices are supposed to come with privacy policies, but most users don't read them, and neither device maker has shown much concern about missing or inconsistent policies.

Computer scientists from America's Clemson University – Song Liao, Christin Wilson, Long Cheng, Hongxin Hu, and Huixing Deng – provided The Register with a pre-publication copy of research they conducted into voice assistant apps and their privacy rules.

In a paper titled, "Measuring the Effectiveness of Privacy Policies for Voice Assistant Applications," the boffins analyzed 64,720 Amazon Alexa skills and 2,201 Google Assistant actions – apps that interact with the voice-controlled mic-speaker hardware people use to bug their own homes – and found them largely lacking.

Of these, 46,768 Alexa skills (72 per cent) and 234 Assistant actions (11 per cent) had no privacy policy at all.

The academics attributed the Google-favoring gap to Amazon's lax skill certification process, the focus of previous research.

"After conducting further experiments on the skill certification, we have understood that even if a skill collects personal information, the developer can choose to not declare it during the certification stage and bypass the privacy policy requirement," the paper states.

"This is achieved by collecting personal information through the conversational interface (e.g., asking users’ names). Even though this data collection is prohibited, the certification system of Amazon Alexa doesn’t reject such skills."

The boffins also observed that of the 234 Assistant actions without a privacy policy, 101 had been developed by Google itself.

What's more, they found 1,755 Alexa skills and 80 Google actions with broken privacy policy links, along with other problems such as duplicate privacy policy URLs shared across different apps and privacy policies whose descriptions are inconsistent with what the app actually does.
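The sort of check involved is straightforward to sketch in Python. Assuming the store listings have already been scraped into a dictionary of app names and policy URLs (the entries below are made up), flagging missing, broken, and shared policy links takes only a few lines:

```python
import requests
from collections import defaultdict

# Hypothetical input: app names mapped to the privacy policy URL scraped
# from each store listing (None when no link is present at all).
apps = {
    "Example Weather": None,
    "Example Trivia": "https://example.com/privacy",
    "Example Quiz": "https://example.com/privacy",
    "Example Jokes": "https://example.com/not-found",
}

missing, broken = [], []
by_url = defaultdict(list)

for name, url in apps.items():
    if not url:
        missing.append(name)  # no privacy policy link in the listing
        continue
    by_url[url].append(name)
    try:
        resp = requests.get(url, timeout=10)
        if resp.status_code != 200:
            broken.append((name, url, resp.status_code))  # dead or broken link
    except requests.RequestException as exc:
        broken.append((name, url, str(exc)))

duplicates = {url: names for url, names in by_url.items() if len(names) > 1}

print("Missing policies:", missing)
print("Broken policy links:", broken)
print("Same URL reused across apps:", duplicates)
```

Checking whether a policy's wording actually matches an app's behaviour is harder, which is the part the paper measures in depth.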

Worse still, Amazon and Google both offer voice assistant apps that violate their own rules. Amazon's Weather skill, for example, collects location data but doesn't provide a privacy policy link in its store description.

The Clemson scientists have published a summary of their findings to GitHub.

Asked why Amazon and Google haven't addressed these issues when the Clemson computer scientists can flag them with a bit of Python code, Long Cheng, assistant professor in the School of Computing at Clemson University and a co-author of the research paper, speculated that it may have something to do with how new these platforms are.

"They probably focus more on implementing new features/functionalities at the current stage," he said in an email, noting that Google [PDF] took the issue seriously and removed the Assistant actions with missing privacy policies. The ad biz also paid a $5,000 reward for reporting the problems.

Amazon makes privacy policies mandatory only for skills that collect personal information, Cheng said, adding: "But we found so many Alexa skills providing meaningless privacy policies."

"The presence of so many problematic privacy policies indicates that Amazon's post-certification audits still need to be improved," he said.

Those developing voice assistant apps are often not professional developers, he said, suggesting that both Amazon and Google have optimized for quantity over quality.

The research paper also describes a survey of 66 Alexa users and 25 Google Assistant users in the US, conducted through Amazon Mechanical Turk. The survey found that 52 per cent of respondents were unaware of the privacy policies of their voice assistant apps; 73 per cent rarely read those privacy policies; and 47 per cent don't know what kind of information their skills/actions are capable of collecting.

Also, 75 per cent of respondents said they would enable a skill intended for kids without reading its privacy policy.

The paper's authors say they've reported their findings to Amazon, Google, and the US Federal Trade Commission.

The Register asked Amazon and Google to comment on the research.

"We've been in touch with a researcher from Clemson University and appreciate their commitment to protecting consumers," a Google spokesperson said in an emailed statement. "All Actions on Google are required to follow our developer policies, and we enforce against any Action that violates these policies."

"We require developers of skills that collect personal information to provide a privacy policy, which we display on the skill’s detail page, and to collect and use that information in compliance with their privacy policy and applicable law," an Amazon spokesperson said in an emailed statement.

"We have not yet been given the opportunity to review this research paper. We will closely review it when available, and engage with the authors to understand more about their work. We appreciate the work of independent researchers who help bring potential issues to our attention."

The paper concludes with the recommendation that platform owners implement a function to briefly summarize voice app privacy policies aloud, since many people only interact with voice assistant software via voice. ®