Apple No Longer Allows Contractors To Listen To Siri Recordings After Customer Privacy Uproar

A report surfaced in late July from a whistleblower who had worked for Apple, warning that contractors could listen to some Siri recordings. The report raised concern among Apple users, and the company has now stated that it is suspending its global internal program for "grading" Siri commands. Apple had said that it employed contractors to listen to less than 1% of Siri commands in order to improve the assistant.

Siri often mishears its wake command and can record up to 30 seconds of private conversation without the device's owner knowing it. The whistleblower said that the recordings contractors were listening to included conversations between patients and doctors, criminal activity, and people having sex.

Apple issued a statement on the privacy row saying that while it is committed to "providing a great Siri experience," it is suspending the global Siri grading program while it conducts a "thorough review." Apple has also promised to give users a way to choose whether they participate in the grading program in a future software update. Apple had been criticized for giving users no way to opt out of its grading program; Android does offer users an opt-out for similar quality improvement programs.

The digital assistant segment is under increasing scrutiny as word spreads that people, not computer systems, are frequently listening in on conversations with digital assistants in the name of improving services. A German regulator recently forced Google to stop a similar program after whistleblowers said that Google contractors heard sensitive information.

Google agreed to suspend its operations for three months while it investigated whether the practice complied with the EU General Data Protection Regulation. Amazon also has humans who listen in on Alexa recordings; those employees are supposed to review an Alexa request and determine whether the digital assistant responded to the command appropriately.