Contractors Monitoring Apple’s Siri Listen to Confidential Details

Apple recently launched a marketing campaign centred on its commitment to privacy.

Apple has a privacy concern at hand.

Contractors hired by the Cupertino-based giant regularly hear confidential medical information, drug deals, and recordings of couples having sex as part of quality control for Apple's voice assistant Siri, a report in The Guardian reveals.

Although Apple does not disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings is passed on to company contractors, the report says.

The contractors are there to grade Siri's responses, including on whether the activation was deliberate or accidental.

The report says that Apple does not explicitly say humans listen to the recordings.

“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” a company response to The Guardian said.

Apple says that less than one percent of Siri activations are used for grading.

A whistle-blower, however, told the daily they were concerned by the lack of disclosure about the humans doing the grading, especially given that accidental activations are more likely to pick up sensitive information.

The whistle-blower told The Guardian that there have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, and sexual encounters.

These recordings are also accompanied by user data showing location, contact details, and app data, according to the whistle-blower.

In its privacy document, however, Apple has stated that Siri data “is not linked to other data that Apple may have from your use of other Apple services”.

The report said that accidental activations produced the most sensitive recordings sent to Apple.

Although Siri is included on most Apple devices, the contractor said the Apple Watch and the HomePod smart speaker were the most frequent source of mistaken recordings.

“Sometimes, you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal… you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch,” the contractor was quoted as saying.

The contractor said staff can report accidental activations only as ‘technical problems.’

“The only function for reporting what you’re listening to seems to be for technical problems. There’s nothing about reporting the content,” the contractor told The Guardian.

The contractor told the daily that they went public out of fear that user information could be misused, and said Apple should disclose the human oversight to its users.

Apple is not the only company using humans to review inputs to its voice assistant. Amazon admitted in April to employing staff to listen to some Alexa recordings, and earlier this month, Google workers were found to be doing the same with Google Assistant.

However, Apple differs from those companies in one key way: while Amazon and Google allow users to opt out of some uses of their recordings, Apple offers no similar choice short of disabling Siri entirely.

(With inputs from The Guardian)
