
Are secret Alexa, Siri and Assistant commands hiding within music?

In the 1970s and 1980s, those loveable fundamentalist Christian types accused Led Zeppelin of hiding Satanic messages in the song Stairway To Heaven. These, of course, were only audible when the track was played backwards.

The “backmasking” furore, which led to mass smashing and burning of records in churches, sounds really silly these days (as it probably would have then).

However, four decades on, we could be in for a repeat performance.

Fast-forward to 2018, and new research claims music could be hiding secret commands to the voice-controlled personal assistants within our connected devices.

Researchers at the University of California, Berkeley were able to activate the AI assistants on smartphones and smart speakers, prompting them to open websites or dial phone numbers.

Worryingly, the researchers say bad actors could use messages hidden within music to unlock doors, access accounts or add items to shopping lists.

Related: Google Assistant vs Amazon Alexa

The new research paper looks at how commands could be embedded into music or spoken-text recordings in order to manipulate always-listening assistants like Siri, Alexa and Google Assistant.

The commands aren’t discernible to humans, but will be gobbled up by the Echo or Home speakers, the research suggests.

“We wanted to see if we could make it even more stealthy,” said Nicholas Carlini, a PhD student in computer security. “My assumption is that the malicious people already employ people to do what I do.”

The researchers are focusing their work on how humans and machines recognise speech differently.

They were able to execute the command “Okay Google, browse to evil.com” within the spoken phrase “without the dataset the article is useless.”

They were also able to embed the same command within a four-second segment of Verdi’s Requiem.

You can hear the audio files on Carlini’s website. Try as we might, we couldn’t discern the difference.

A New York Times report explains how it works:

“With audio attacks, the researchers are exploiting the gap between human and machine speech recognition. Speech recognition systems typically translate each sound to a letter, eventually compiling those into words and phrases. By making slight changes to audio files, researchers were able to cancel out the sound that the speech recognition system was supposed to hear and replace it with a sound that would be transcribed differently by machines while being nearly undetectable to the human ear.”
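The gap the NYT describes can be illustrated with a toy gradient attack. The sketch below is purely hypothetical: the linear “recogniser”, the array sizes and the step parameters are all invented for the example, whereas the researchers targeted a real deep speech-recognition model. The principle is the same, though: repeatedly nudge the waveform toward a target transcription while clipping the perturbation so it stays tiny (and hence, in the real attack, near-inaudible).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a speech recogniser: a linear scorer over two
# possible "transcriptions" (0 = what the clip really says,
# 1 = the attacker's hidden command). Invented for illustration.
W = rng.normal(size=(2, 100))

def transcribe(audio):
    """Return whichever transcription the toy model scores higher."""
    return int(np.argmax(W @ audio))

audio = 0.1 * rng.normal(size=100)    # the original recording (toy waveform)
target = 1 - transcribe(audio)        # whatever the clip does NOT say

# Projected gradient steps: move the waveform toward the target
# transcription, but clip the perturbation to a small budget `eps`
# so the modified clip stays close to the original.
eps, lr = 0.05, 0.01
delta = np.zeros_like(audio)
for _ in range(500):
    grad = W[target] - W[1 - target]  # gradient of the score margin
    delta = np.clip(delta + lr * grad, -eps, eps)
    if transcribe(audio + delta) == target:
        break

print(transcribe(audio), transcribe(audio + delta))  # transcription flips
print(np.abs(delta).max() <= eps)                    # perturbation stays small
```

Against a real network the gradient comes from backpropagating a transcription loss through the model, and the “small” constraint is tuned against human hearing rather than a simple clip, but the loop is structurally the same.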

With this in mind, wrongdoers could potentially play music within ‘earshot’ of a Google Home’s microphone in order to command it to open your smart door lock.

Amazon told the NYT it takes steps to keep Alexa secure, while Google says Assistant has features that can mitigate commands that aren’t discernible to humans. Apple points out that HomePod can’t do things like open doors, while iPhones have to be unlocked to execute certain Siri commands.

Carlini added: “We want to demonstrate that it’s possible and then hope that other people will say, ‘OK this is possible, now let’s try and fix it.'”

Are you turning the microphones off on your smart speakers as we speak? Or does this sound far-fetched? Drop us a line @TrustedReviews on Twitter
