Humans At Apple Listen To Siri’s Recordings Of People Having Sex, Etc. [Updated]

UPDATE: Following the revelation that contractors have been listening to users’ Siri recordings, the Cupertino tech giant has announced that it will suspend its global grading program, halting human review of conversations. Additionally, after reviewing its Siri policies, Apple will allow users to opt out of the program.

In a statement to Bloomberg, Apple said, “We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

While Amazon and Google have already come under fire over their “smart” virtual assistants, the latest to join them is, surprisingly, Apple’s virtual assistant Siri.

According to a report by The Guardian, Apple contractors have been listening to users’ conversations all along in order to improve Siri.

What Information Does Siri Get Hold Of?

The report suggests that Siri picks up recordings of couples having sex, confidential medical information, drug deals, and more. These conversations may be recorded through Apple devices, including the HomePod and Apple Watch.

All this is done for quality checks: Siri is graded on various aspects, such as whether its activation was deliberate or accidental, whether it could answer the query, and whether its reply proved helpful.

Apple’s Take On This

While Apple doesn’t explicitly state that it sends users’ Siri recordings to contractors, the company has responded to the report, saying it analyzes a small portion of recordings (around 1%) simply to improve the virtual assistant.

Apple, in a statement to The Guardian, said, “A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri’s responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

Additionally, a whistleblower at Apple wants the Cupertino company to make the whole process public, primarily to protect users’ sensitive information.

Déjà Vu?

This is not the first time a virtual assistant has been caught saving users’ recordings; Amazon’s Alexa has been doing it for a while, and Amazon recently even admitted to the practice.

Google Assistant also saves users’ recordings; Google admits that around 0.2% of them are reviewed to enhance the AI assistant’s skills.

While all these tech companies are ostensibly just trying to improve their services for us, the way our data is being used is not acceptable, especially coming from Apple, which doesn’t miss a chance to take a dig at others over privacy and security.

I hope they come up with better solutions in the future!
