Apple apologizes for listening to Siri recordings, makes it an opt-in feature
Earlier this month, Apple suspended a program in which it hired contractors to listen to recordings of Siri voice commands and grade the accuracy of transcriptions. Now, the company is making human review an opt-in feature, and it’ll be done in-house by Apple employees. Unless you opt into grading, Apple says it won’t store your Siri voice recordings at all.
“As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize,” Apple said in a blog post, adding that it will resume the grading program with opt-in permission this fall.
Although Apple, Google, and Amazon had all been using human review to improve their respective voice assistants for years, a recent round of reporting brought these programs under fresh scrutiny. Bloomberg reported in April that Amazon was using contractors to review audio clips of Alexa commands, and stories last month by Belgian public broadcaster VRT News and the Guardian described similar practices by Google and Apple, respectively. Google has suspended its human review program for three months while European regulators investigate; Amazon has added a clear way to opt out.
Apple’s decision to end human review of Siri recordings by default—and to stop storing them entirely without permission—is the strongest response yet, which makes sense given Apple’s attempts to lead the tech industry on privacy. If Amazon and Google don’t respond in kind, Apple could turn what was clearly a misstep into another marketing opportunity.