Apple and its products are synonymous with privacy, but recently the company had to face severe backlash over a program it ran to grade Siri responses. Apple had hired third-party contractors to listen in on anonymised recordings of Apple users asking questions to Siri. The review process, aimed at understanding if Siri is invoked by mistake, has temporarily been shut down, TechCrunch has confirmed.

Apple said in a statement that the controversial program is under review and that Siri grading has been suspended worldwide. This comes shortly after Google made a similar announcement, though only for the EU. Apple is taking the matter seriously and already has a fix in mind to prevent further backlash when the program eventually resumes.

"We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading," Apple told TechCrunch.


The Guardian broke the news about Siri's grading program last week, reporting that Apple contractors were listening to private conversations, including sexual encounters, discussions between doctors and patients, business deals, criminal dealings and more. Apple sent anonymised audio snippets to the contractors to maintain a degree of privacy, but the idea that someone might be listening to Siri queries is unsettling for many.

This is not an uncommon practice in the industry. Like Apple, Amazon and Google have had actual humans review recordings captured by their respective digital assistants. The idea is to improve the voice assistants so they respond to users more accurately and efficiently. But the practice raises privacy concerns, even though the companies' terms and conditions vaguely allude to it.


Apple's terms of service, for instance, mention the sharing of Siri data, but say nothing about sharing live recordings. According to Apple, its contractors reviewed less than 1 percent of daily Siri requests.

The fact that Apple acted immediately and plans to add an option for users to choose whether to participate in the "grading" process restores some faith in its commitment to privacy. It's worth pointing out that Google Assistant and Alexa users can already opt out of some uses of their recordings.

Watch out for future software updates from Apple, and be sure to update all your iDevices, including the HomePod, iPhone and Apple Watch, to take control of how your Siri recordings are used.