Smart Speakers Hacked To Listen In, Steal Passwords

Google Home Mini

Malicious third-party apps for Google Home and Amazon’s Alexa can listen in on conversations and trick users into revealing passwords

Researchers have demonstrated how malicious third-party apps for smart speakers such as Amazon’s Alexa and Google Home can be used to eavesdrop on conversations and harvest sensitive data such as passwords.

The hacks make use of what the researchers said were insufficiently rigorous vetting processes at both companies.

Both Amazon and Google examine new third-party apps before offering them to users, but don’t re-examine apps when updates are applied, said Luise Frerichs and Fabian Bräunlein of Security Research Labs (SRLabs).

As a result, hackers can create innocuous-looking apps that provide horoscope information, for instance, but can be updated to be able to listen in on users.

alexa, amazon, echo, smart speakers
Amazon

Voice phishing attack

To carry out a password-stealing attack, an app could be modified in such a way that, when activated, it reads out an error message such as “This skill is currently not available in your country.”

The app then remains silent for a few seconds or longer, leading the user to believe it is no longer active.  This is carried out by inserting a string of non-pronounceable characters, such as U+D801 (dot, space).

It then reads out a phishing prompt, for instance asking the user to speak their password in order to activate a system update, leading the user to believe it is the system asking for their password and not the third-party app.

The password information is sent directly to the servers of the third-party app maker.

Eavesdropping

On Amazon devices, the eavesdropping attack requires the user to say a key word such as “stop”, after which the app says “goodbye”, but remains active.

It then listens for a sentence beginning with a commonly used term, and records what is said after that, sending the words back to the third-party server.

On Google devices this attack is more powerful, not requiring any specified trigger words and being able to record what the user says indefinitely.

In a proof of concept video, Frerichs is shown triggering a third-party random integer generator.  After the app carries out its function, it says “goodbye”, and apparently switches off, but in reality continues recording what is said and transmitting it to the third-party server.

Frerichs and Bräunlein said the implications of having internet-connected smart speakers listening in were “further-reaching than previously understood”.

“Users need to be more aware of the potential of malicious voice apps that abuse their smart speakers,” they said in an advisory.

Caution

“Using a new voice app should be approached with a similar level of caution as installing a new app on your smartphone.”

They recommended Amazon and Google to begin reviewing apps when they are updated and to ban suspicious elements such as silent characters and terms such as “password”.

Amazon did not immediately respond to a request for comment.

Google said its Google Home apps, called Actions, are required to follow its developer policies.

“We have review processes to detect the type of behaviour described in this report, and we removed the Actions that we found from these researchers,” the company said.

“We are putting additional mechanisms in place to prevent these issues from occurring in the future.”

Google added that Home devices will never ask for their account passwords.