Amazon and Google: Trust us, our smart-speaker apps are carefully policed. Boffins: Yes, well, about that...
Who can you trust these days?
The voice applications people use with their Amazon Alexa and Google Assistant smart speakers have privacy policies, but most users don't read them, and neither device maker has shown much concern about policy problems or inconsistencies.
Computer scientists from America's Clemson University – Song Liao, Christin Wilson, Long Cheng, Hongxin Hu, and Huixing Deng – provided The Register with a pre-publication copy of research they conducted into voice assistant apps and their privacy rules.
In a paper titled, "Measuring the Effectiveness of Privacy Policies for Voice Assistant Applications," the boffins analyzed 64,720 Amazon Alexa skills and 2,201 Google Assistant actions – apps that interact with the voice-controlled mic-speaker hardware people use to bug their own homes – and found them largely lacking.
The academics attributed the gap – Google's actions fared noticeably better than Amazon's skills – to Amazon's lax skill certification process, the focus of previous research.
"This is achieved by collecting personal information through the conversational interface (e.g., asking users’ names). Even though this data collection is prohibited, the certification system of Amazon Alexa doesn’t reject such skills."
The Clemson scientists have published a summary of their findings to GitHub.
Asked why Amazon and Google haven't addressed these issues when the Clemson computer scientists can flag them with a bit of Python code, Long Cheng, assistant professor in the school of computing at Clemson University and a co-author of the research paper, speculated that it may have something to do with how new these platforms are.
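The paper doesn't publish the researchers' tooling, but the sort of automated check they describe – flagging voice apps whose privacy policies are missing, duplicated across unrelated apps, or too short to say anything meaningful – can be sketched in a few lines of Python. The skill records and word-count threshold below are illustrative assumptions, not the team's actual pipeline:

```python
# Hypothetical sketch of an automated privacy-policy check of the kind the
# researchers describe. Skill records and the word-count threshold are
# illustrative, not taken from the paper.

def flag_policy_problems(skills, min_words=50):
    """Return a dict mapping each problem type to the offending skill names."""
    problems = {"missing": [], "too_short": [], "duplicate": []}
    seen_policies = {}  # policy text -> first skill that used it
    for skill in skills:
        name = skill["name"]
        policy = (skill.get("privacy_policy") or "").strip()
        if not policy:
            problems["missing"].append(name)
            continue
        if len(policy.split()) < min_words:
            problems["too_short"].append(name)
        if policy in seen_policies:
            problems["duplicate"].append(name)
        else:
            seen_policies[policy] = name
    return problems

sample = [
    {"name": "WeatherPal", "privacy_policy": None},
    {"name": "QuizTime", "privacy_policy": "We respect your privacy."},
    {"name": "QuizTime 2", "privacy_policy": "We respect your privacy."},
]
print(flag_policy_problems(sample))
# → {'missing': ['WeatherPal'], 'too_short': ['QuizTime', 'QuizTime 2'],
#    'duplicate': ['QuizTime 2']}
```

Checks like these require no cooperation from the platforms – only the skill listings that Amazon and Google already publish – which is what makes the vendors' inaction the more striking.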
"They probably focus more on implementing new features/functionalities at the current stage," he said in an email, noting that Google took the issue seriously and removed the Assistant actions with missing privacy policies. The ad biz also paid a $5,000 reward for reporting the problems.
Amazon makes privacy policies mandatory only for skills that collect personal information, Cheng said, adding: "But we found so many Alexa skills providing meaningless privacy policies."
"The presence of so many problematic privacy policies indicates that Amazon's post-certification audits still need to be improved," he said.
Those developing voice assistant apps are often not professional developers, he said, suggesting that both Amazon and Google have optimized for quantity over quality.
The research paper also describes a survey of 66 Alexa and 25 Google Assistant US-based users, conducted through Amazon Mechanical Turk. The survey found that 52 per cent of respondents were unaware of the privacy policies of their voice assistant apps; 73 per cent rarely read those privacy policies; and 47 per cent didn't know what kind of information their skills/actions are capable of collecting.
The paper's authors say they've reported their findings to Amazon, Google, and the US Federal Trade Commission.
The Register asked Amazon and Google to comment on the research.
"We've been in touch with a researcher from Clemson University and appreciate their commitment to protecting consumers," a Google spokesperson said in an emailed statement. "All Actions on Google are required to follow our developer policies, and we enforce against any Action that violates these policies."
"We have not yet been given the opportunity to review this research paper. We will closely review it when available, and engage with the authors to understand more about their work. We appreciate the work of independent researchers who help bring potential issues to our attention."
The paper concludes with the recommendation that platform owners implement a function to briefly summarize voice app privacy policies aloud, since many people only interact with voice assistant software via voice. ®