Alexa, are you profiting from the illegal storage and analysis of kids' voice commands?
I'm sorry, I won't comment on ongoing litigation: Two privacy lawsuits filed against Amazon
Amazon has been hit with two lawsuits in the US regarding the recording and storage of children's voices through its Alexa digital assistant.
The lawsuits [PDF] were lodged in California and Washington courts this week by the guardians of unnamed children aged 8 and 10, and are largely identical: both allege that Amazon is illegally profiting from the analysis of requests and commands minors have made to the Echo family of devices, as well as an increasing range of other devices that use the voice-recognition technology.
The lawsuits argue that while "most people believe that when they speak to an Alexa-enabled device, it converts their voice into a set of digital computer instructions… They do not expect that Alexa is creating and storing permanent recordings of their voice."
That's only half the problem, though, the lawsuits argue. They allege that Amazon is keeping a permanent record of those recordings and using them "for its own commercial benefit."
In particular, the lawsuits claim that Amazon is "creating voiceprints of users, which can be used to identify them when they speak to other devices in other locations," and as such is creating a "massive database of voice recordings containing the private details of millions of Americans."
The lawsuits point out that California and Washington have specific laws prohibiting the recording of any oral communication without the consent of everyone involved. And they reject the idea that Amazon obtains that consent when users first set up their devices.
"There is a large group of individuals who do not consent to be recorded when using an Alexa-enabled device and who use Alexa without any understanding or warning that Amazon is recording and voice printing them: children."
What are the plaintiffs worried about? Future use and abuse of such voice databases. "It takes no great leap of imagination to be concerned that Amazon is developing voiceprints for millions of children that could allow the company (and potentially governments) to track a child's use of Alexa-enabled devices in multiple locations and match the uses with a vast level of detail about the child's life, ranging from private questions they have asked Alexa to the products that [they] have used in their homes."
Why keep the recordings?
At the center of the argument is the question of whether Amazon actually needs to store the voice recordings it makes. The internet goliath says it does this to improve its service, though it is notoriously vague about precisely how that works.
It's unclear how accurately Amazon can identify who it is listening to: to the best of our knowledge, it has yet to offer, for example, individualized options, where it recognizes exactly who is speaking and adjusts music, calendar, or ordering options in response. Whether it can do that and simply hasn't offered the service for fear of freaking people out, or whether its technology isn't accurate enough, is uncertain.
"Simply put, the more data Amazon collects, the more use Amazon has for each incremental data point Amazon collects," the lawsuits continue.
Some suspect the web giant is using the voice recordings as part of a larger database to figure out how to sell more goods and services to people. For example, if it knows about everyone who lives in your home – perhaps due to orders and deliveries through Amazon.com – it will also be able to connect voices within that home to individuals. And then connect what those individuals request through Alexa-enabled devices to those individuals.
It will also have your name, putting it in a position to connect countless other pieces of information gleaned from other databases and build full profiles of specific individuals.
Is it doing that? We don't know. But we do know that this kind of database on millions of netizens is what has made companies like Google and Facebook some of the wealthiest corporations on the planet, so it's a fair bet Amazon is hoping to do likewise.
Besides, Amazon doesn't need to do any of this, the lawsuits argue: it is perfectly possible to offer digital assistant services without making and storing recordings. They reference other services, such as Apple's Siri and voice-recognition technology from Mercedes-Benz, to argue that recordings only need to be stored for a short time for the service to be effective.
In addition to California and Washington, another seven US states require consent from all parties for a voice recording to be made: Florida, Illinois, Maryland, Massachusetts, Michigan, New Hampshire, and Pennsylvania. The lawsuits hope to pull people from all those states into a larger class-action suit against Amazon.
With the rapidly expanding use of this technology in the United States – there are an estimated 100 million Alexa-enabled devices alone – it is inevitable that lawsuits will seek to put some kind of legal constraints on what can be done with the information that results.
Kids are, naturally enough, a good group to focus on, since the law is stricter about what can be done with data concerning minors.
Just last month, campaigners complained to America's trade watchdog, the FTC, about their growing unease about Amazon's new Echo Dot Kids Edition and its handling of youngsters' privacy. Some groups also recommend parents not buy or use the device, despite Amazon's claims that it actively seeks parental consent for any recordings.
Spokespeople for Amazon declined to comment on the ongoing litigation. ®