Communications of the ACM

ACM TechNews

Uncovered: 1,000 Phrases That Incorrectly Trigger Alexa, Siri, and Google Assistant

Researchers say they have identified more than 1,000 word sequences that should not trigger voice assistants, but do.


Researchers at Ruhr University Bochum and the Max Planck Institute for Security and Privacy in Germany have identified more than 1,000 word sequences that incorrectly trigger voice assistants like Alexa, Google Home, and Siri.

The researchers found that dialogue from TV shows and other sources produces false triggers that activate the devices, raising privacy concerns.

Depending on pronunciation, Alexa will wake to the words "unacceptable" and "election," Siri will respond to "a city," and Google Home to "OK, cool."

They note that when the devices wake, a portion of the conversation is recorded and transmitted to the manufacturer, where employees may transcribe and check the audio to help improve word recognition. This means each company’s logs may contain fragments of potentially private conversations.

From Ars Technica

Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA
