Apple stops listening to Siri audio snippets in response to privacy concerns

HIGHLIGHTS

Apple has suspended its programme of analysing Siri recordings worldwide.

The analysis was being conducted by contractors to improve the service.

Apple also said it would issue a software update in the future that will seek explicit user consent.

Apple leaves no stone unturned in talking about how it upholds user privacy across its ecosystem. The company has put up billboards reminding people that it takes privacy seriously, and it has in the past refused authorities' demands for backdoor access to its devices. Yet the Cupertino giant was recently caught using recordings of Siri queries to improve the service, according to a report by The Guardian. Within a week of the story breaking, Apple suspended the programme worldwide.

The Guardian story cited a contractor at a firm hired by Apple to listen to a few seconds of Siri recordings, chosen at random and anonymised, to determine whether the digital assistant was hearing user queries correctly or being invoked by mistake.

Apple also said it would issue a software update in the future that will ask for explicit consent, letting users choose whether or not they want to participate in the process.

According to The Guardian piece, Apple hires contractors to listen to snippets of audio that are not labelled with the names or IDs of individuals. The contractors use these snippets to judge whether Siri is answering questions correctly and not being invoked accidentally. Apple calls this process grading. The contractor claimed the audio snippets might still contain identifiable information despite Apple anonymising the recordings.

There's also the question of explicit consent from users for an activity in which their data is analysed, and in this case no such consent was taken. Siri's terms and conditions also don't state explicitly that live recordings from Siri will be used to improve the service. Apple claimed grading is performed on only 1 percent of daily Siri queries.

As a corrective measure, Apple will ask users whether they want to participate in the programme when they first set up their iOS devices. To be fair, most other voice assistants regularly analyse live recordings from users to improve their service, be it Amazon's Alexa, Microsoft's Cortana or Google Assistant.

But it is Apple's staunch positioning as a privacy-focused company that magnifies the issue. You can't have a company claiming to be the champion of users' privacy while sharing live recordings from a voice assistant with third-party contractors. Thankfully, the issue has been addressed and a corrective update will be pushed out soon.

Digit NewsDesk
