Other researchers involved are Takeshi Sugawara of the University of Electro-Communications in Tokyo and Benjamin Cyr, a doctoral student in computer science and engineering at U-M.
So there is potential for a high-tech criminal to trick a smart speaker into opening smart-locked doors, sneak into a home, and burgle the heck out of it.
The researchers said: "The implications of injecting unauthorized voice commands vary in severity based on the type of commands that can be executed through voice. We show how an attacker can use light-injected voice commands to unlock the victim's smart lock-protected home doors, or even locate, unlock and start various vehicles".
As the high-tech "smart home" is increasingly controlled by voice commands issued through devices like Google Home or Amazon Alexa, it becomes enormously susceptible to outside attacks, to say nothing of the surveillance possibilities.
The vulnerability, dubbed "Light Commands", was discovered by cyber-security researcher Takeshi Sugawara and a group of researchers from the University of Michigan.
Researchers from Japan and the University of Michigan have discovered that voice assistants and smart speakers such as Google Home, Amazon Alexa, Facebook Portal and Apple Siri can be hacked using light.
By calibrating the lasers to match the frequency of a human voice, the boffins were effectively able to beam commands to a selection of smart speakers as well as an iPhone and a pair of Android devices. According to data from tech market researcher Canalys, companies shipped 26.1 million smart speakers in the second quarter.
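At a high level, the attack amplitude-modulates the laser's brightness with the audio waveform of a spoken command, so the diaphragm of the device's MEMS microphone responds to the flickering light as if it were sound. The Python sketch below illustrates only that modulation step under stated assumptions: the bias and swing values, the helper name, and the placeholder tone standing in for a recorded command are all illustrative, not parameters taken from the paper.

```python
import numpy as np

# Minimal sketch of the amplitude-modulation idea behind "Light Commands".
# An audio waveform is mapped onto a laser diode's drive current so that
# the emitted light intensity follows the voice signal; a MEMS microphone
# facing the beam registers the intensity changes as if they were sound.

SAMPLE_RATE = 16_000   # Hz, typical for voice audio
BIAS_mA = 200.0        # assumed DC operating current of the laser diode
SWING_mA = 150.0       # assumed peak modulation depth around the bias

def audio_to_drive_current(audio: np.ndarray) -> np.ndarray:
    """Map a [-1, 1] audio waveform to a strictly positive drive current."""
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_mA + SWING_mA * audio  # light intensity tracks the waveform

# Placeholder for a recorded command such as "OK Google, unlock the door":
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE       # one second of samples
command = 0.8 * np.sin(2 * np.pi * 440.0 * t)  # stand-in 440 Hz tone

drive = audio_to_drive_current(command)
print(drive.min(), drive.max())  # current stays positive: bias +/- swing
```

The DC bias matters because a laser diode cannot emit "negative" light: the command audio rides on top of a constant intensity, and only the variation around that bias is what the microphone picks up.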
A new study by researchers at the University of Michigan and the University of Electro-Communications in Tokyo establishes that lasers can be used to hijack voice-assistant devices.
Google, which is aware of the research, confirmed that it is "closely reviewing this research paper". The researchers said the longest distance they tested was limited by the longest space available to them (a hallway).
How Can Hackers Take Advantage of the Bug? The technique can make smart speakers, smartphones, and tablets perform numerous tasks even from hundreds of feet away.
Owners could also physically block the device's microphone, or move it away from any window. Takeshi Sugawara, a visiting scholar at the University of Michigan and the paper's lead author, said one way to do this would be to create an obstacle that blocks a straight line of sight to the microphone's diaphragm.