Security Loophole In Your Virtual Assistant

It is said that only our soulmate can hear our unsaid thoughts. Technology, however, is trying to prove us wrong. The hugely popular smart speakers and smart assistants listen to you even when you are not talking to them. In recent studies, researchers succeeded in proving that the digital assistants Alexa, Google Assistant, and Siri can hear inaudible commands and follow them. Wondering how this is possible? Read on to find out!

Is It Really Possible?

Echo, HomePod, and Home all share a basic function: receiving and interpreting audio, namely the commands spoken directly to them. The devices are so good at this that they can even follow commands the user never gave.

A series of tests supports the theory. In a 2016 study by Georgetown University and the University of California, Berkeley, researchers used white noise to hide secret commands. With this technique, students made smart devices switch to Airplane Mode and navigate to websites, concealing the commands in white noise played through YouTube videos and over speakers.

White noise masks other surrounding sounds because it is a mix of all the frequencies the human ear can detect. To make the commands inaudible to humans, the researchers slipped the smart-speaker commands into the white noise.

This month, Berkeley researchers published a paper that takes the research to the next level. The paper shows that they could embed silent commands in music files and spoken text. For instance, while you are listening to your favorite music, your smart speaker could be receiving hidden commands to make an online purchase or change your settings.

In another demonstration of the threat, Chinese researchers built a device from basic parts that sends hidden commands to digital assistants. The original model was rudimentary and had limitations, such as needing to be close to the target smart speaker. The researchers kept refining it, however, and it now works even from 25 feet away.
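The principle behind such attacks can be illustrated with a short sketch. This is not the researchers' actual code; it is a minimal Python example of the general idea: amplitude-modulating a "voice" signal onto an ultrasonic carrier (25 kHz here is an assumed value), so that all of the transmitted energy sits above the roughly 20 kHz limit of human hearing, while a microphone's nonlinearity can demodulate it back into an audible command.

```python
import numpy as np

FS = 192_000        # sample rate high enough to represent ultrasonic content
CARRIER_HZ = 25_000  # assumed carrier, above the ~20 kHz limit of human hearing

def modulate_ultrasonic(voice: np.ndarray, fs: int = FS,
                        carrier_hz: int = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic carrier.

    The output contains energy only near the carrier frequency, so a human
    cannot hear it, yet a microphone's nonlinear response can shift the
    voice content back down into the audible band.
    """
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Standard AM: the voice spectrum is shifted up to sit around the carrier.
    return (1.0 + voice) * carrier

# Stand-in for a spoken command: a 400 Hz tone (clearly audible at baseband).
t = np.arange(FS) / FS
voice = 0.5 * np.sin(2 * np.pi * 400 * t)
signal = modulate_ultrasonic(voice)

# Check where the energy lives: the spectral peak should be above 20 kHz.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / FS)
peak_hz = freqs[np.argmax(spectrum)]
```

Running this, the spectral peak lands at the carrier frequency, well outside the audible range, even though the underlying 400 Hz "command" is trivially audible on its own.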

To sum up, someone could issue an inaudible command that makes a smart speaker do things its user never asked for.

This is a serious issue. So far, no real-world incident has confirmed that the theory works in practice. But there is no telling when it might, and if it does, it could pose a genuine security risk.

All three virtual assistant developers deny that any such exploitation is possible. Google and Amazon say they verify that it is the user who is giving the voice command, so a silent command would not work. Apple, for its part, restricts the actions Siri can perform by voice; moreover, the user needs to unlock his or her iPhone or iPad to carry out many of these tasks.

Although all three companies have offered assurances, and Amazon and Google have even taken countermeasures, the growing number of theories and supporting papers is making the issue a matter of concern. Voice recognition systems are built to interpret each sound as a letter, and these letters are then assembled into a complete word or phrase.

These papers and studies compel us to consider that if speech recognition can be manipulated, then the original message can be overridden and replaced with a completely different command, one inaudible to the human ear.

The papers were published not to show off how vulnerable the devices are, but to alert users to how cybercriminals could bypass security and victimize them through their virtual assistants. How ironic that the more attracted you become to a new form of technology, the more you learn about its darker side.

Srishti Sisodia is a technical content journalist at Systweak Software. Apart from being a capable engineer, her affinity for writing draws her toward creating interesting content about contemporary technologies and their progress. She is an avid reader and a food connoisseur. She relishes different cuisines, and when it comes to baking, she takes the cake!
