Apple insists it’s only for a few seconds.
CUPERTINO, Kalifornia (PNN) - July 28, 2019 - Should it come as any surprise? A whistleblower working for Apple has revealed that its popular voice-activated spying device (er, helpful virtual assistant) Siri, now in millions of households, "regularly" records people having sex and captures "countless" other invasive moments, which it promptly forwards to Apple contractors for their listening pleasure (er, "quality control").
Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex as part of their job providing quality control, or "grading," the company's Siri voice assistant.
For comparison, Amazon's Alexa terms of use disclose that the company collects and stores most of what you say to Alexa (or perhaps what you groan), including the product's geolocation along with your voice instructions.
However, what has not been disclosed, or at least not widely known until now, is that a "small proportion" of all Siri recordings, including those made in what consumers thought were private settings, are forwarded to Apple contractors around the world, according to a new report. Supposedly this is to ensure Siri is responding properly and continues to understand dictation. Apple says the data "are used to help Siri and dictation understand you better and recognize what you say."
But an anonymous current company insider and whistleblower said, “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
This contradicts Apple's defense that these sultry samples are "pseudonymized recordings": Apple contractors can know precisely who was having sex, where, and at what time the deed was done.
Apple's formal response was as follows:
“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
Just trust us, Apple appears to be saying. Most of the sensitive data is captured through so-called accidental activations, when a device mistakenly hears its "trigger word," according to the report, with the highest rates of such occurrences on the Apple Watch and HomePod smart speaker.
“The regularity of accidental triggers on the watch is incredibly high,” the company whistleblower explained. “The watch can record some snippets that will be 30 seconds - not that long but you can gather a good idea of what’s going on.”
The insider continued, "You can definitely hear a doctor and patient talking about the medical history of the patient. Or you'd hear someone, maybe with car engine background noise - you can't say definitely, but it's a drug deal - you can definitely hear it happening. You'd hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch."
Less than comforting is just how many people across the globe have access to these private moments. "There's not much vetting of who works there, and the amount of data that we're free to look through seems quite broad," the contractor continued. "It wouldn't be difficult to identify the person you're listening to, especially with accidental triggers - addresses, names and so on."
“Apple is subcontracting out; there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”
The evidence continues to mount. Siri is a blackmailer's dream come true... or a spy agency's, or a voyeur's, or a political adversary's, or just a plain pervert's.