Apple to stop storing Siri audio after contractors heard private talks and sex

An Apple Watch Nike+ Series 4 with the Siri voice assistant activated. (Getty Images | Wachiwit)

Apple will stop storing voice recordings captured by Siri-enabled devices except in cases when users intentionally opt in to the practice. “[B]y default, we will no longer retain audio recordings of Siri interactions,” Apple said in an announcement yesterday. “We will continue to use computer-generated transcripts to help Siri improve.”

The change won’t take effect right away. The default storage of Siri recordings will stop “with a future software release in fall 2019,” Apple said in a support document.

Historically, Apple stored Siri recordings by default and had humans review “less than 0.2 percent” of the audio samples “to measure how well Siri was responding and to improve its reliability,” Apple said.

Apple previously paused human review

This is the second major change made by Apple since a July report that its contractors who review the recordings for accuracy heard private discussions and even sexual encounters. A month ago, Apple said it was temporarily suspending human reviews of Siri voice recordings and that it would resume human reviews only after giving customers the choice of whether to opt in to the practice.

But until this week, Apple hadn’t promised to stop storing the voice recordings by default.

“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process—which we call grading,” Apple said in its announcement yesterday. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.”

Apple intends to resume human review of recordings after it gives users an option to opt in to the storage and human review. “We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time,” Apple said.

Apple said it keeps track of Siri data with a “random identifier” consisting of a long string of letters and numbers associated with a single device. This system avoids linking Siri data to “your Apple ID or phone number—a process that we believe is unique among the digital assistants in use today,” Apple said.

“When we store Siri data on our servers, we don’t use it to build a marketing profile and we never sell it to anyone,” Apple also said. “We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private.”

No more contractors reviewing audio

Under its new process, Apple won’t have contractors review voice recordings anymore. “[W]hen customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions,” the company said. “Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.”

It isn’t clear whether Apple will immediately delete all voice recordings made before the policy change. We asked Apple for an answer to this question and will update this story if we get a response.

The recordings should all be deleted eventually as part of Apple’s normal processes, though. An Apple security guide says that “User voice recordings are saved for a six-month period so that the recognition system can utilize them to better understand the user’s voice. After six months, another copy is saved, without its identifier, for use by Apple in improving and developing Siri for up to two years.”

The current process for deleting your previous Siri recordings from Apple’s servers is laborious because it involves disabling Siri on every one of your Apple devices. As Apple notes, “When you turn Siri and Dictation off, Apple will delete the User Data associated with your Siri identifier, and the learning process will start all over again.”

Apple will continue storing computer-generated transcripts even for users who don’t opt in to storage of recordings. Like voice recordings, Apple said these “are associated with a random identifier, not your Apple ID.”

“If you do not want transcriptions of your Siri audio recordings to be retained, you can disable Siri and Dictation in Settings,” Apple said. The company also says that the computer-generated transcripts “are used in machine learning training to improve Siri, determine common usage patterns, and update language and understanding models,” and to help Apple “resolve critical problems that affect Siri reliability.”

Other major tech companies have also been grappling with reviews of voice assistant recordings recently. Microsoft declined to pause reviews and instead changed its privacy policy after backlash. Google, taking an approach closer to Apple’s, recently paused reviews of voice recordings after a contractor leaked 1,000 voice recordings to a media outlet.

Google says that the storing of voice and audio activity is set to “off” by default when people create Google accounts. Google users who have turned this on can disable the saving of voice activity and other types of personal information at Google’s activity controls site, where they can also delete past recordings.

Tech – Ars Technica
