DolphinAttack: Hackers could hijack Siri, Alexa and other voice assistants by exploiting design flaw
The vulnerability affects voice assistants from Apple, Google, Amazon, Microsoft, Samsung and Huawei.
Voice assistants such as Siri and Alexa are potentially open to hijacking, thanks to a vulnerability uncovered by Chinese security researchers. The bug, essentially a design flaw, opens the door to a new kind of attack, dubbed DolphinAttack, in which cybercriminals "silently" whisper commands into smartphones to hijack Siri and Alexa.
Researchers at Zhejiang University say that DolphinAttack allows hackers to send commands to voice assistants using ultrasonic frequencies, which are too high for humans to hear but are still picked up by the microphones on devices. The attack could let hackers hijack voice assistants such as Siri and Alexa and redirect users to malicious websites.
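According to the researchers' paper, the trick is to amplitude-modulate an ordinary spoken command onto an ultrasonic carrier; nonlinearities in a device's microphone hardware then demodulate the signal back into the audible band that the assistant's speech recogniser expects. The snippet below is a minimal sketch of that modulation step only, assuming a pre-recorded command in a WAV file; the 25 kHz carrier, 96 kHz output rate and modulation depth are illustrative values, and real-world constraints such as speaker bandwidth and per-device carrier tuning are glossed over.

```python
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000  # illustrative ultrasonic carrier, above human hearing (~20 kHz)
OUT_RATE = 96_000    # playback rate must exceed 2 * (carrier + voice bandwidth)

def modulate_command(in_wav: str, out_wav: str, depth: float = 0.8) -> None:
    """Amplitude-modulate a recorded voice command onto an ultrasonic carrier."""
    rate, voice = wavfile.read(in_wav)
    if voice.ndim > 1:                  # mix stereo down to mono
        voice = voice.mean(axis=1)
    voice = voice.astype(np.float64)
    voice /= np.max(np.abs(voice))      # normalise to [-1, 1]

    # Resample the baseband command up to the ultrasonic playback rate
    n_out = int(len(voice) * OUT_RATE / rate)
    voice = np.interp(
        np.linspace(0.0, len(voice) - 1, n_out),
        np.arange(len(voice)),
        voice,
    )

    # Classic AM: (1 + depth * voice) * carrier. A perfectly linear microphone
    # would record only inaudible ultrasound; the nonlinearity of real
    # microphone hardware is what recovers the hidden voice command.
    t = np.arange(n_out) / OUT_RATE
    am = (1.0 + depth * voice) * np.cos(2.0 * np.pi * CARRIER_HZ * t)

    am = (am / np.max(np.abs(am)) * 32767).astype(np.int16)
    wavfile.write(out_wav, OUT_RATE, am)
```

Played through a speaker capable of ultrasonic output, a file produced this way would be silent to bystanders but, per the paper, decodes back into the original command inside the target device's microphone.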
The vulnerability reportedly affects voice assistant apps from Apple, Google, Amazon, Microsoft, Samsung and Huawei.
The researchers said that using DolphinAttack, hackers could issue various kinds of commands, ranging from a simple "Hey Siri" to making an iPhone call a specific number or telling an Amazon Echo to "open the backdoor".
In a few proof-of-concept attacks, the researchers issued "inaudible voice commands" that let them activate Siri "to initiate a FaceTime call on iPhone", activate Google Now "to switch the phone to the airplane mode" and manipulate the navigation system of an Audi.
"The attack distances vary from 2 cm to a maximum value of 175 cm and show a great variation across devices. Notably, the maximum distance that we can achieve for both attacks is 165 cm on Amazon Echo. We argue that the distance can be increased with the equipment that can generate a sound with higher pressure levels and exhibit better acoustic directionality, or by using shorter and more recognizable commands," the researchers said in their paper.
In other words, DolphinAttack's effectiveness depends on the device. Commanding an Amazon Echo to open a door, for instance, would not be practical: with a maximum range of 165 cm, the attacker would already have to be inside the target's house. Hijacking an iPhone would be comparatively simple, as the hacker only needs to get within range of the target, something easily done in a crowded public place.
How to stay safe from DolphinAttack
Since DolphinAttack relies on voice assistants being active, the simplest way to stay safe from such attacks is to turn off Siri, Alexa or Google Assistant.
"I think Silicon Valley has blind spots in not thinking about how a product may be misused. It's not as robust a part of the product planning as it should be," Ame Elliott, design director at the nonprofit SimplySecure told FastCo. "Voice systems are clearly hard to secure. And that should raise questions. It's difficult to understand how the systems work, and sometimes by deliberate design. I think hard work is needed to undo the seamlessness of voice and think about adding more visibility into how the system works."