Siri, Alexa, and Google Assistant can be incorrectly activated by more than 1,000 phrases, new research has found; "Montana" and "election" are among the words that can trigger these voice assistants. According to Ars Technica's report, privacy advocates have grown concerned that the devices' near-constant listening to nearby conversations may pose more risk than benefit, now that voice assistants such as Google Home, Siri, and Alexa have become fixtures in millions of homes.

The research findings showed that false triggers are commonly produced by dialogue in TV shows and other sources, causing the devices to turn on and even send nearby sounds to Apple, Amazon, Google, or other manufacturers. The researchers uncovered more than 1,000 word sequences, including phrases from Modern Family, House of Cards, Game of Thrones, and news broadcasts, that can incorrectly activate the devices.

"The devices are intentionally programmed in a somewhat forgiving manner because they are supposed to be able to understand their humans," said Dorothea Kolossa, one of the researchers.

"Therefore, they are more likely to start up once too often rather than not at all," she added.

Here are some of the words and phrases the researchers found can falsely trigger each voice assistant:

Google Home: "Okay, cool" and "Okay, who is reading"

Siri: "hey jerry" and "a city"

Microsoft Cortana: "Montana"

Alexa: "election," "a letter," and "unacceptable"
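
The paper does not publish any vendor's detection code, but the trade-off Kolossa describes can be sketched in a few lines: a detector scores how closely each snippet of audio matches the wake word and activates whenever the score clears an acceptance threshold. The toy Python below is only an illustration, with string similarity (difflib) standing in for a real acoustic model; the wake word, test phrases, and threshold values are assumptions for the demo, not figures from the study.

```python
# Toy sketch of the trade-off the researchers describe -- NOT any
# vendor's real wake-word engine. A detector scores how closely heard
# audio matches the wake word; a "forgiving" (lower) threshold misses
# fewer genuine activations but accepts more near-misses.
from difflib import SequenceMatcher

WAKE_WORD = "alexa"

def match_score(heard: str) -> float:
    """Stand-in for an acoustic model: 0.0 = no match, 1.0 = exact match."""
    return SequenceMatcher(None, heard.lower(), WAKE_WORD).ratio()

def is_activated(heard: str, threshold: float) -> bool:
    """The device wakes whenever the score clears the acceptance threshold."""
    return match_score(heard) >= threshold

for phrase in ["alexa", "a letter", "election", "unacceptable"]:
    score = match_score(phrase)
    print(f"{phrase!r:16} score={score:.2f} "
          f"strict(0.90)={is_activated(phrase, 0.90)} "
          f"forgiving(0.45)={is_activated(phrase, 0.45)}")
```

With the strict threshold, only the exact wake word activates the device; the forgiving threshold also accepts the near-miss "a letter." That forgiving behavior is what keeps the devices usable across accents and noisy rooms, and it is also what produces the false triggers above.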

The 1,000 phrases that can incorrectly activate Google Assistant, Alexa, and Siri

According to Ars Technica, the researchers claim that when the devices are incorrectly activated, they can record a portion of people's conversations and transmit it to the manufacturer, where employees may check and transcribe the audio in an attempt to improve word recognition.

This may result in an unacceptable intrusion, since fragments of potentially private conversations can end up in company logs. The privacy risk isn't solely theoretical: according to the report, law enforcement authorities investigating a murder in 2016 subpoenaed Amazon for Alexa data transmitted in the moments leading up to the crime. And according to a 2019 report by The Guardian, Apple employees transcribed sensitive conversations recorded by Siri.

These included private discussions between doctors and patients, apparent criminal deals, sexual encounters, and business deals. The researchers analyzed voice assistants from Apple, Amazon, Microsoft, Google, and Deutsche Telekom, as well as three Chinese models from Baidu, Xiaomi, and Tencent. The study results, published on Tuesday, June 30, focused on Apple, Amazon, Google, and Microsoft, none of which immediately responded to a request for comment.
