Apple Confirms "a Small Portion" of Siri Recordings Get Reviewed by Contractors
Contractors working on Siri regularly hear confidential medical information, drug deals, recordings of couples having sex, and other private information, according to a report from The Guardian based on details shared by a contractor who works on one of Apple's Siri teams.
The employee who shared the information is one of many contractors around the world who listen to Siri voice data collected from customers to improve the Siri voice experience and help Siri better understand incoming commands and queries.
According to The Guardian, the employee shared the information out of concern over Apple's lack of disclosure about the human oversight, though Apple has confirmed several times in the past that this takes place, and the practice has been outlined in previous reports as well.
In a statement, Apple confirmed to The Guardian that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri. A small, random subset (less than 1 percent) of daily Siri activations are used for grading, with each clip lasting only a few seconds.
"A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."
Apple has not made its human-based Siri analysis a secret, but its extensive privacy terms don't appear to explicitly state that Siri information is listened to by humans. The employee said that Apple should "reveal to users" that human oversight exists.
The contractor who spoke to The Guardian said that "the regularity of accidental triggers on the watch is incredibly high," and that some snippets were up to 30 seconds in length. Employees listening to Siri recordings are encouraged to report accidental activations as a technical problem, but aren't told to report anything about the content itself.
As stated in Apple's security white paper, user voice data is saved for a six-month period so that the recognition system can use it to better understand a person's voice. The saved voice data is identified using a random identifier that's assigned when Siri is turned on, and it is never linked to an Apple ID. After six months, a second copy is saved without any identifier and is used by Apple to improve Siri for up to two years. A small number of recordings, transcripts, and associated data without identifying information is sometimes used by Apple for ongoing improvement of Siri beyond two years.
Apple's privacy website has a Siri section that offers more detail, explaining that all Siri queries are assigned a random identifier not associated with an Apple ID. The identifier is reset whenever Siri is turned off and then on again, and turning Siri off deletes all user data associated with a Siri identifier.
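As a rough illustration of the identifier scheme described above, the lifecycle can be modeled in a few lines. This is a toy sketch of the stated behavior, not Apple's actual implementation; the class and method names are invented for the example.

```python
import uuid


class SiriIdentifierModel:
    """Toy model of the identifier lifecycle Apple describes:
    queries are tagged with a random identifier (never an Apple ID),
    turning Siri off deletes the data tied to that identifier, and
    turning it back on assigns a fresh one."""

    def __init__(self):
        self.enabled = True
        self.identifier = uuid.uuid4()  # random, not tied to an Apple ID
        self.stored_queries = {}        # identifier -> list of query texts

    def record_query(self, text):
        if self.enabled:
            self.stored_queries.setdefault(self.identifier, []).append(text)

    def disable(self):
        # Turning Siri off deletes user data tied to the old identifier.
        self.stored_queries.pop(self.identifier, None)
        self.enabled = False

    def enable(self):
        # Turning Siri back on resets the identifier to a new random value.
        self.identifier = uuid.uuid4()
        self.enabled = True
```

In this sketch, toggling Siri off and on leaves no stored queries linked to the old identifier, matching the behavior the privacy page describes.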
Those concerned about Siri triggering accidentally on devices like the iPhone, Apple Watch, and HomePod can turn off the "Hey Siri" feature and activate Siri manually instead, and Siri can also be turned off entirely.