Strangers can listen in on whatever you tell Siri or other virtual assistants. One such stranger is Reddit user FallenMyst, who took to the discussion website to post about her experience listening to conversations, both boring and lewd, between smartphone owners and their virtual assistants.

FallenMyst started a new job as a speech-to-text analyst for a company called Walk N' Talk Technologies, where her task is to listen to short sound bites and rate how accurately each text transcription matches the audio clip.

"Guys, I'm telling you, if you've said it to your phone, it's been recorded ... and there's a damn good chance a third party is going to hear it," she said. "I heard everything from kiddos asking innocent things like , 'Siri, do you like me?' to some guy asking Galaxy to lick his butthole. I wish I was kidding."

FallenMyst told Motherboard that her work is done through CrowdFlower, a crowdsourced data mining company that pays people to perform simple, repetitive data analysis tasks for other companies.

Motherboard reporter Kaleigh Rogers signed up with CrowdFlower and took on the Walk N' Talk job, which pays one cent for every 10 audio clips analyzed.

Rogers said none of the audio clips contained personally identifying information. Their contents range from the ordinary things people ask their virtual assistants, such as "What time is it?" and "What's the weather like today?", to more privacy-sensitive commands such as "Text Dakota: I'm bored." There is also no shortage of clips of people asking Siri or Galaxy to marry them or to show them "boobies."

Apple's iOS Software License Agreement states that using Siri means the user consents to the "transmission, collection, maintenance, processing, and use of this information including your voice input and User Data, to provide and improve Siri," so it should come as no surprise that other people have access to whatever users tell Siri.

In 2013, Apple spokesperson Trudy Muller told Wired that Apple takes customer privacy seriously and collects only anonymized data from Siri to improve the service. Muller said Apple assigns a randomly generated number to each user and associates that number with the user's voice files, which are kept on Apple's servers for six months, after which the identifying number is deleted. Apple may still keep the audio clips for another 18 months.

"Apple may keep anonymized Siri data for up to two years," Muller said. "If a user turns Siri off, both identifiers are deleted immediately along with any associated data."

Samsung has a similar policy for voice commands. Its Privacy Policy states: "we may work with a third-party service provider that provides speech-to-text conversion services on our behalf."

Google, on the other hand, does not say whether voice commands sent to Google Now are stored or passed to third parties for analysis, but it does let users listen to and delete their own voice recordings.

Cortana is in a similar boat: Microsoft says only that it collects information through its virtual assistant, and it does not say whether that information is sent to third parties.

While it is perfectly legal to hand these audio clips to third parties, Christopher Soghoian, principal technologist at the American Civil Liberties Union, believes Americans would probably be shocked to learn that what they tell their virtual assistants is being accessed by Apple or Google.

"Customers have a certain expectation about what's happening when they interact with a company," Soghoian said. "People don't like it when they think they're talking to a computer and they're not or vice versa."

Moreover, Soghoian suggests the exposure is not limited to the voice commands themselves. Because many people also dictate text messages through Siri or Google Now, they risk exposing sensitive information from those messages via their virtual assistants. Still, Soghoian does not consider Siri a privacy invasion, since Apple's purpose in listening to users' voice commands is to improve its service.

Photo: Vasile Cotovanu | Flickr
