A new report details claims from a whistleblower that Apple contractors regularly hear private information while performing quality control on Siri. Drug deals and couples having sex are among the things contractors reportedly hear in the process.

Apple Contractors

The source of the Guardian’s latest report is a contractor who works to “grade,” or provide quality control for, Apple’s voice assistant Siri. According to the whistleblower, contractors regularly hear recordings of Siri interactions that contain confidential medical information, couples having sex, and even drug deals.

Evidently, the “grading” is done to help Siri better understand what people say. The recordings are sent to contractors, who grade them on several factors, such as whether the activation was deliberate or accidental and whether Siri responded appropriately.

Accidental Siri Recordings

The problem, according to the report, is that Apple does not explicitly tell users that a portion of their Siri recordings is sent to contractors, or that those recordings are listened to by other humans.

The whistleblower also states that this lack of disclosure is concerning, especially given how often accidental Siri activations capture extremely sensitive information. For instance, a phrase that sounds like the wake word, or even the sound of a zip, can activate Siri without the user’s knowledge. When this happens, users may not realize they are disclosing private information that could be heard by other people.

While the contractors’ job is only to review Siri interactions and report technical problems, the concern remains that contractors could misuse the data they come across.

Privacy Issue

In response to the report, Apple explained that the recordings are all analyzed in secure facilities, and that everyone who reviews them is obligated to comply with strict confidentiality requirements. Further, the company notes that only a small random sample, about 1 percent of daily Siri activations, is used for grading, and that these recordings are often just a few seconds long.

This privacy issue is not new. Google and Amazon have also come under fire for the same practice of using humans to review and improve interactions with their respective voice assistants. Apple users concerned about accidentally triggering Siri can turn off the “Hey Siri” feature and trigger the voice assistant manually instead, or turn Siri off completely.
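For reference, on recent versions of iOS these controls typically live under Settings > Siri & Search: toggling off “Listen for ‘Hey Siri’” disables voice activation, while “Press Side Button for Siri” (or “Press Home for Siri” on older devices) keeps manual activation available. Disabling both options turns Siri off entirely. Exact menu names may vary by device and iOS version.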
