Hey Siri, Stop Listening: Apple regularly hears confidential details on Siri

August 27, 2019

2 min read


What's going on here?

Apple contractors regularly hear confidential details on Siri recordings.

What does this mean?

An anonymous Apple whistleblower has revealed that Siri quality control contractors regularly hear sensitive information, including medical details, criminal activity and even “sexual encounter[s]”. Less than 1% of Siri recordings are passed on to contractors working for the company. Apple says this data is shared for the purpose of “help[ing] Siri and dictation… understand you better and recognise what you say”.

However, the whistleblower expressed concern about this lack of disclosure, particularly given how frequently accidental activations pick up extremely sensitive personal information. These accidental recordings are accompanied by user data showing location, contact details and app data.

What's the big picture effect?

With technology becoming increasingly crucial to everyday life, the potential for abuse, or for dangerous accidents, is huge. Siri, like any other voice-powered assistant, can be triggered by phrases and sounds that merely resemble its wake word, potentially recording your conversations. These conversations could be extremely sensitive, and if the activation is accidental, the user doesn’t even know they are being recorded. While it may be uncomfortable to think Siri may have accidentally recorded your conversation, the contractors expressed just as much discomfort listening to such private calls. They were motivated to go public about their work by the fear that such information could be misused.

While it is understandable that human analysis of Siri interactions is important, and these recordings are selected at random, some feel that users should have the right to opt out of the pool that Apple draws from, or simply to delete their Siri queries. As it stands, the only way to see your conversation history is to ask Siri, and the only way to delete it is to clear your Safari history and disable Siri dictation. If Apple values its reputation for user privacy, it will have to take these concerns into consideration and create such an option for its users.

Report written by Maab Saifeldin


