Siri sends sensitive recordings to subcontractors
- by Joann Nelson
- in Sci-tech
- — Jul 28, 2019
Apple has confirmed that some Siri recordings are heard by contractors hired to "grade" them in order to improve the voice assistant.
However, as one contractor working for the tech giant revealed, the voice assistant, which is often activated accidentally by a phrase that sounds like its wake words, or even simply by raising an arm wearing an Apple Watch, can record sensitive private information.
Earlier this month, reports emerged that Google contractors around the world had access to Google Assistant voice recordings. The Guardian's Alex Hern then spoke with an anonymous contractor who performs quality control on Siri and who said they were concerned about how often Siri tends to pick up "extremely sensitive personal information".
"A small portion of Siri requests are analysed to improve Siri and dictation". "There's not much vetting of who works there, and the amount of data that we're free to look through seems quite broad. And you'd hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch", according to the source.
It is worth noting that while Google and Amazon allow users to opt out of some uses of their recordings, Apple offers no comparable option for Siri.
Additionally, Apple sidestepped the question of whether any of the recordings could be used to identify a person.
Responding to the report, Apple confirmed to The Guardian that it does analyse a small number of Siri requests for the "purpose of improving Siri".
The problem, according to the report, is that Apple does not explicitly tell its users that a percentage of their recordings is sent to contractors, or that the recordings are listened to by other humans. According to the company's statement, the recordings are "analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements".
In Apple's case, however, the news may come as more of a shock to customers, given that the brand has often touted its commitment to user privacy, offering encryption on many of its products that the company says cannot be read even by its own employees. In fact, I don't want recordings made of my audio, period: I want the audio processed and immediately discarded. Apple had initially promised users that anonymity was maintained and that no user information was accessible. There is one right response to this report, and that is for Apple to change its policies and communicate them clearly.
The news is another strike against voice assistants. Apple says that each snippet of Siri audio graded by these third-party firms runs for only a few seconds. Still, in the wrong hands, the data could be misused in many ways.
It is now official: if you use a voice assistant, pretty much any voice assistant, someone could be listening in.