Apple says sorry for Siri slurping voice commands of unsuspecting users
Devises three-point plan to up its privacy game
Sorry seems to be the hardest word for some - Apple has finally apologised to customers, weeks after it emerged that contractors had been asked by the company to listen to recordings of people using the Siri digital assistant.
More importantly, Apple said it is changing the way it does things.
Apple, which is at pains to convince users that it takes their privacy seriously, suspended recording voice commands that are part of its Siri quality evaluation process - known internally as grading - at the start of August.
"We know that customers have been concerned by recent reports," Apple said last night. "We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We decided to make some changes to Siri as a result."
Siri uses a random identifier, a "string of letters and numbers associated with a single device", to keep tabs on data as it is being handled, instead of connecting it to a user's Apple ID or phone number. After six months on a server, the device's data is "disassociated from the random identifier".
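The pattern Apple describes, keying requests to a random per-device token rather than an account identity, then severing that link after a retention window, can be sketched roughly as follows. This is not Apple's implementation; the class, names, and retention logic are all hypothetical, written only to illustrate the described process:

```python
import secrets

# Hypothetical sketch of the pseudonymisation pattern described above:
# requests are keyed by a random per-device identifier rather than an
# Apple ID or phone number, and that link is dropped after roughly
# six months (retention window is an assumption based on the article).
RETENTION_SECONDS = 6 * 30 * 24 * 3600


class RequestStore:
    def __init__(self):
        # each record: [random_id, timestamp, transcript]
        self._records = []

    @staticmethod
    def new_device_id() -> str:
        # A random string of letters and numbers, generated per device
        # and not derived from any user identity.
        return secrets.token_hex(16)

    def log_request(self, device_id: str, transcript: str, now: float):
        self._records.append([device_id, now, transcript])

    def disassociate_expired(self, now: float):
        # After the retention window, strip the identifier but keep the
        # data itself for quality evaluation.
        for record in self._records:
            if record[0] is not None and now - record[1] > RETENTION_SECONDS:
                record[0] = None


store = RequestStore()
dev = store.new_device_id()
store.log_request(dev, "set a timer for ten minutes", now=0.0)
store.disassociate_expired(now=RETENTION_SECONDS + 1)
# The transcript survives, but the device link is gone.
```

The point of the design is that even someone with full access to the stored requests cannot walk from a recording back to an account; once the identifier is dropped, the data is pseudonymous at best.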
The audio of interactions with the digital assistant, and a "computer-generated transcription of it", are used to train Siri and to help measure and improve the quality of its responses.
"As a result of the review, we realise we haven't been fully living up to our high ideals, and for that we apologise," said Apple. The company plans to resume the practice later this autumn, when software updates are rolled out to its legion of users, after three changes are enacted.
First, Apple will no longer retain the audio recordings of users' voice commands to Siri by default, and will instead rely on computer-generated transcripts. Second, users will be able to opt in to having their interactions recorded, and to opt out again at any time. Third, only Apple staff will be allowed to listen to the recordings of those who opt in.
"Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri," Apple added.
The vendor signed off by saying: "We created Siri to help [users] get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for constantly pushing us to improve."
And those contractors? What happens to the 300 freelancers in Apple's Irish ops who were processing that data? According to the Guardian yesterday, Apple has let them go with a week's notice.
Apple isn't alone here: both Amazon (Alexa) and Google (Google Home) also record users making voice commands to their respective digital assistants, processes that have similarly lacked transparency. Belgian journos last month claimed they'd heard over a thousand of the excerpts, "153 of which were conversations that should never have been recorded and during which the command 'Okay Google' was clearly not given." Facebook likewise hired hundreds of contractors to listen to clips of its users' voice calls to transcribe parts of conversations its AI software couldn't understand. Those users had opted to have Facebook Messenger transcribe their calls. Microsoft also gets humans involved in the analysis of Cortana and Skype recordings.
Both Google and Amazon are being sued stateside for allegedly recording minors. In Germany, data regulators have launched a probe into Google's processes to ascertain if a violation of European privacy rules has happened. ®