
Voice Assistants Can Be Hacked with Laser Rays

The internet and hacking go hand in hand, but researchers now claim that even a hand-held laser device could compromise the voice assistants on one’s smart devices

 

A team of researchers at the University of Michigan has revealed that shining laser beams at smart devices could allow attackers to hack into resident voice assistants such as Alexa, Siri and Google Assistant. Of course, one may feel safe for now, as not too many of our friends possess devices capable of mounting a laser attack!

But things may get dicey soon enough, as the researchers claim attackers can remotely inject inaudible and invisible commands into voice assistants using laser beams aimed at the microphones of these devices. And the research team believes that even the light from a hand-held toy laser pointer could do the trick.

In their research note, the team from the University of Michigan says that lasers can be used to tamper with the micro-electro-mechanical systems (MEMS) in the microphones, making them respond to light as though it were sound. This could be exploited to inject sound into microphones simply by modulating the amplitude of a laser beam, the paper said.
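To make the amplitude-modulation idea concrete, here is a minimal sketch of how an attacker-controlled voice waveform might be mapped onto a laser-intensity signal. The sample rate, bias and modulation depth below are illustrative assumptions, not values taken from the paper, and the code only models the signal mapping, not the optics or hardware.

```python
# Minimal sketch: amplitude-modulating an audio waveform onto laser intensity.
# All parameters here are assumptions for illustration only.
import numpy as np

SAMPLE_RATE = 16_000   # assumed audio sample rate in Hz
DC_BIAS = 1.0          # assumed steady ("carrier") laser intensity, arbitrary units
MOD_DEPTH = 0.5        # assumed modulation depth (0..1)

def command_to_laser_intensity(audio: np.ndarray) -> np.ndarray:
    """Map a voice-command waveform (normalised to [-1, 1]) onto a
    laser-intensity signal via simple amplitude modulation.

    The MEMS microphone responds to the modulated light roughly as it
    would to the original sound-pressure wave, which is the effect the
    researchers describe exploiting.
    """
    audio = np.clip(audio, -1.0, 1.0)
    intensity = DC_BIAS * (1.0 + MOD_DEPTH * audio)
    return np.clip(intensity, 0.0, None)   # physical intensity cannot be negative

if __name__ == "__main__":
    # Stand-in for a recorded voice command: a one-second 440 Hz test tone.
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
    fake_command = np.sin(2 * np.pi * 440 * t)
    laser_signal = command_to_laser_intensity(fake_command)
    print(laser_signal[:5])
```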

The researchers reportedly could gain full access to voice assistants from distances of up to around 110 meters. “We show that user authentication on these devices is often lacking or non-existent, allowing the attacker to use light-injected voice commands to unlock the target’s smart lock-protected front doors, open garage doors, shop on e-commerce websites at the target’s expense, or even locate, unlock and start various vehicles (e.g., Tesla and Ford) that are connected to the target’s Google account,” the paper says.

The researchers have shared their findings with Amazon, Apple, Google, Tesla and Ford and are in touch with the security teams of these vendors, as well as with ICS-CERT and the FDA.

Some of the companies have reportedly taken the research on board and are reviewing the results, with Amazon proposing to engage with the research team to better understand the challenges that laser attacks could present to voice assistants. Google was quoted by Foxnews.com as saying it is reviewing the paper with a view to improving the security of its home devices.

In the recent past, voice assistants have made news for the wrong reasons, given that geeks have found them performing activities under the hood, including an ability to snoop around the house. With Amazon attempting to become the Voice of Everything, this latest research could prove a dampener for those with serious privacy concerns.

Of course, things aren’t as bad as they might look at first glance, because hacking a voice assistant this way requires expertise, specialized equipment and an unobstructed view of the device, the researchers note.


TAGS: Alexa, Google Assistant, Tesla, Siri, Apple Siri, Apple, Amazon
