Digital voice assistants in your smartphone, car and home are vulnerable to cyber attacks that can let hackers compromise your privacy. But the reliance on smart things is growing by the day.

Smartphones are practically a part of our daily lives. Homes and cars are getting smarter too, and digital voice assistants like Siri, Alexa and Google Now run the show for the most part.

Ordering our phone to make a call while our hands are full, or asking Alexa for a weather update while we get ready for the office, seems magical, but it is not without risks.


Researchers at Zhejiang University have demonstrated in a published paper how digital assistants in smartphones, smart home products and even cars can be exploited using completely inaudible frequencies.

Simply put, owners of the devices won't know their digital voice assistant is being tricked by a sound that's inaudible to the human ear.

The researchers call this method of attacking voice assistant-powered devices DolphinAttack, because it uses ultrasonic frequencies to mimic wake phrases like "Hey Siri" and "Alexa" and then issue commands.

In the demo video shown below, DolphinAttack initiates a call without directly talking to Siri in an audible tone.

"In this paper, we propose DolphinAttack, an inaudible attack to SR systems. DolphinAttack leverages the AM (amplitude modulation) technique to modulate audible voice commands on ultrasonic carriers by which the command signals cannot be perceived by human," the researchers said in their study.

"With DolphinAttack, an adversary can attack major SR systems including Siri, Google Now, Alexa, and etc," they said.
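The amplitude-modulation trick the researchers describe can be illustrated with a minimal sketch. The carrier frequency, sample rate and the stand-in "voice" tone below are illustrative assumptions for this article, not the researchers' actual parameters; the point is only that the audible command rides as the envelope of an ultrasonic carrier.

```python
import math

def am_modulate(baseband, carrier_hz, sample_rate):
    """Amplitude-modulate a baseband signal onto an ultrasonic carrier.

    Standard AM: s[n] = (1 + m[n]) * cos(2*pi*fc*n/fs).
    With fc above roughly 20 kHz the carrier itself is inaudible;
    nonlinearity in a microphone's hardware can recover the envelope
    (the original voice command) as an in-band audio signal.
    """
    return [
        (1.0 + m) * math.cos(2 * math.pi * carrier_hz * n / sample_rate)
        for n, m in enumerate(baseband)
    ]

# Toy "voice command": a 1 kHz tone standing in for recorded speech.
fs = 192_000   # high sample rate, needed to represent a 25 kHz carrier
fc = 25_000    # assumed ultrasonic carrier, above human hearing
tone = [0.5 * math.sin(2 * math.pi * 1_000 * n / fs) for n in range(fs // 100)]

modulated = am_modulate(tone, fc, fs)
```

Played through an ultrasonic transducer, a signal shaped like `modulated` would be silent to a bystander, while the assistant's microphone demodulates and hears the embedded command.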


Using the DolphinAttack technique, hackers can execute various commands, like activating Siri to initiate a FaceTime call, activating Google Now to switch a device to airplane mode and, alarmingly, manipulating the navigation system in an Audi.

But that's not the extent of it. Hackers could even command a device to visit a malicious site and download a virus, or to publish material online.

With some services, like PayPal, supporting money transfers via voice commands on a smartphone, hackers could also inflict financial damage on a user.


The portable transmitter used to launch DolphinAttack did not require a sophisticated setup. The researchers built it from a smartphone, an amplifier, an ultrasonic transducer and a battery.

These items aren't hard to find in a local electronics store. But the transmitter's range was limited to about 5 feet.

The success rate of a DolphinAttack against Siri was 100 percent in quiet, office-like spaces, but dropped to 80 percent in a moderately busy area like a café, and fell to 30 percent on the street.

How to prevent DolphinAttack?

Since smartphones are the most common devices equipped with digital voice assistants, they're likely to be the easiest targets for hackers. The best way to prevent an attack such as DolphinAttack is to open the phone's settings and turn off the wake phrase.

On iOS devices, you can go to Settings > General > Siri and turn off "Allow Hey Siri".

On Android phones, you can open the Google app, tap Menu > Settings from the top left, select Voice > "Ok Google" detection and turn it off.