Apple to stop storing Siri recordings without permission after privacy backlash

Apple will stop storing audio recordings of what users say to Siri unless they explicitly opt in, following a privacy backlash against the widespread practice of humans listening to users' voice clips without their knowledge.

The phone maker apologised to users and promised that in future only Apple employees, not outside contractors, would listen to audio from users who gave their permission.

Apple's listening programme was suspended earlier this month after a whistleblower revealed that contractors listening to audio clips had overheard drug deals, couples having sex and sensitive medical information.

The company now says it will restart the programme this autumn, but only for users who choose to take part and only after its next operating system update becomes available.

Google, Facebook and Microsoft have all been forced to suspend the controversial practice, which is used to improve AI speech recognition systems but was not explicitly disclosed to users in privacy policies. Amazon continues to use human listeners.

"At Apple, we believe privacy is a fundamental human right," said the company. "We design our products to protect users’ personal data, and we are constantly working to strengthen those protections. 

"We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. 

"As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologise."

The company said it would still let humans read computer-generated transcripts, but that Apple employees would not know the identities of the users whose clips they listened to.

The practice of using human ears to check AI assistants' work, without explicitly asking users for permission, has drawn criticism from privacy advocates and scrutiny from data protection authorities.

AI voice assistants are frequently activated by accident and may record audio that their users never intended them to hear. Last year Amazon's Alexa misinterpreted a private conversation happening in the background as a series of requests for it to record the conversation and send the recording to one of its owners' friends. 

Apple's whistleblower told the Guardian that there had been "countless instances of recordings featuring private discussions" including "seemingly criminal dealings" and "sexual encounters", in part because Siri often interprets the sound of a zip as a human voice saying its name.

Source: The Telegraph