First Amazon, then Google and now Apple. The unethical practice of listening in on people's private moments is a far bigger problem than these major corporations let on. Apple, like its rivals, has outsourced the task of listening to Siri recordings to contractors, who grade the voice assistant's responses on various parameters.

While the idea is to improve Siri and let the virtual assistant understand you better and recognise what you say without your having to repeat yourself, it is still a privacy breach, as actual people are listening to countless private moments. What makes it worse is the fact that Siri records people's conversations even when it is not being addressed.

Siri's accidental-activation problem is very real, and human review of the digital voice assistant's recordings is a breach of people's trust in Apple. The Cupertino-based tech titan hasn't denied the practice, but has assured users that the contractors are bound by "strict confidentiality requirements."

"A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements," Apple told the Guardian.

Siri is not as innocent as you think (Reuters)

But that didn't stop an Apple contractor and whistleblower from revealing Apple's dirty secrets to the world. The anonymous contractor working for the firm expressed concerns about Apple keeping this practice a secret, especially considering how frequent Siri's accidental activations are and the kind of content it picks up.

"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data," the contractor told the paper.

According to the contractor, Apple only gives the staff the option to report accidental activations as technical glitches and nothing more. There is no way to flag sensitive recordings, and anyone with a motive could easily identify who is in those recordings using the accompanying location, contact details and app data.

Apple HomePod Siri interaction (Apple Press Kit)

These sensitive recordings are picked up by Siri when it thinks it hears the "wake word." According to the Guardian, Siri can mistake the sound of a zip for its trigger, and in the case of an Apple Watch, the raise of a wrist followed by speech can be interpreted as a command to wake up and start recording.

Apple isn't alone in this practice, as recent revelations showed Google and Amazon analysing audio recordings from their respective digital assistants. But there's a huge difference between Siri and the other two digital assistants. In the case of Google Assistant and Alexa, users can opt out of some uses of their recordings, but Apple offers Siri users no such opt-out.

The only thing Apple product users can do is be careful about what they say around Siri-enabled iDevices like the iPhone, HomePod and Apple Watch. The irony of the situation isn't lost on anyone. For a company that has built its reputation on putting users' privacy first, Apple isn't earning any brownie points here.