Apple: Ok, ok, we'll stop listening in on your Siri conversations. For now, but maybe in the future
Just don't ask us to stop recording and storing them, says tech privacy leader
Apple has hit pause on contractors listening to and making notes on recordings of people using its Siri digital assistant, after the secretive practice was exposed.
Cook & Co has tried to differentiate itself from the rest of the tech industry by stressing its privacy credentials, but it has notably not ended the Siri listening program, saying only that it won't start it up again until it has "reviewed" the current system.
Apple will also not stop recording or storing the voice interactions that its millions of users have with the digital assistant. But Apple has said it will, in the future, provide the option for users to opt out of the system in which contractors across the world are allowed to listen in to Siri recordings in order to "grade" them.
It's not clear whether users will have to actively opt out of that process or whether Apple will default to not listening to people's requests. Probably the former.
The grading process is used by Apple to figure out if the Siri response was useful, whether the request itself was something Siri should be expected to answer, and, critically, whether the recording should have happened in the first place or was accidental.
A series of media reports into what companies like Amazon, Google and Apple are doing with recordings of users interacting with their respective Alexa, Google Home and Siri services has put the tech companies in a privacy spotlight.
They have all been less than transparent about what actually happens with recordings, with third-party contractors revealing that they regularly listen to people in their homes doing things that clearly weren't expected to be recorded, including arguing, carrying out business deals, talking to their doctor about medical issues, and having sex.
They're all at it
Earlier today, Google agreed to stop listening in to recordings made through its Google Home device – in Europe – after the German authorities launched an investigation into the practice to see if it violated European privacy rules. Germany's data protection commissioner ordered the online giant to stop manual reviews for three months.
Google and Amazon are also being sued in the United States for what the lawsuits claim is illegal recording of minors through these services.
When it comes to Siri and Apple's contractors listening in, the company said in a statement: "While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."
What is legal and illegal, reasonable and unreasonable is not a settled issue, although it is notable that all the US tech giants have used that uncertainty to do the most they can. This includes listening in to and transcribing recordings even when they are certain a recording was "accidental": the device misheard its "wake word" and captured a user who never intended to be listened to.
Despite knowing that many of these recordings should never have occurred, the tech companies have not deleted them, and have continued to store and transcribe their contents, presumably in the hope of improving the systems. Privacy advocates are furious.
According to some of the third-party contractors who spoke anonymously with reporters, when it comes to Apple's system, the Apple Watch is the greatest producer of bad recordings. ®