Alexa, Disarm the Victim's Home Security System

Published: 23/11/2024   Category: security




Researchers who last year hacked popular voice assistants with laser pointers take their work to the next level.



It's still a mystery to researchers at the University of Michigan and The University of Electro-Communications (Tokyo) just what physically enabled them to inject commands into the embedded microphones of Amazon Alexa, Google Home, and other digital voice assistant devices via laser pointers.
The team in 2019 used light to remotely control Google Assistant, Amazon Alexa, Facebook Portal, and Apple Siri by exploiting a vulnerability in their so-called MEMS microphones. They used the light beams to inject invisible and inaudible commands to the digital voice assistants as well as voice-controlled smartphones and tablets – through glass windows as far away as 110 meters (120 yards).
They're now taking their research to a new phase.
"There's still some mystery around the physical causality of how it's working. We're investigating that more in-depth," says Benjamin Cyr, a Ph.D. student at Michigan who, along with researcher Sara Rampazzi, will be presenting the latest iteration of the research at Black Hat Europe on Dec. 10. "Why do the mikes respond to light as if it's sound?" he says. "We want to try to nail down what's happening on a physical level, so that future hardware designs protect them from light-injection attacks."
They are now studying the security of sensing systems overall as well, including those found in medical devices, autonomous vehicles, industrial systems – and even space systems.
Cyr, Rampazzi, an assistant professor at the University of Florida, and Daniel Genkin, an assistant professor at the University of Michigan, plan to show at Black Hat Europe how a security camera could be manipulated via a hijacked voice assistant with which it interfaces. They'll be demonstrating their light-injection hack against the Amazon Echo 3, a newer model of the smart speaker system that was not available last year when they first tested Echo, Siri, Facebook Portal, and Google Home. Cyr says they haven't had the opportunity yet to test the fourth-generation Echo speaker.
As a bonus, Cyr says he plans to demonstrate what the laser beam actually sounds like when it hits the mike of the digital assistant. "I'll be taking some recordings of the mike to play during the demo," he says.
At the heart of the research is the broader problem of an explosion of Internet of Things devices on the market that were not built with security in mind.
"We want to understand ... how to defend against these vulnerabilities. Our final goal is to protect the system and make it more resilient, not only for the attack we found but for future attacks that have not yet been discovered," Rampazzi says.
Cat Toys and Light Commands
The researchers spent just $2,000 on the equipment used to conduct the attack, which they dubbed Light Commands; it included laser pointers, a laser driver, and a sound amplifier. However, they say it could be done for as little as $100, including a low-end laser pointer for cats that can be bought on Amazon.
"The Amazon lasers we bought were for cats and came with cat toys," Cyr says. "So we were giving away cat toys after that."
For longer-range attacks, they purchased a $200 telephoto lens, which allowed them to shoot the light beam down a long hallway. They encode the command signal intended for the mike into the intensity of the light beam.
"You shoot it at the acoustic part of the mike, and that then gets converted into an acoustic signal. So the voltage signal looks exactly the same as if it were produced by an acoustic signal," Cyr says.
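The principle the researchers describe, modulating a command waveform onto the laser's intensity so the mike's voltage output tracks the audio, can be sketched roughly as follows. This is an illustrative sketch only, not the team's actual tooling; the function name and the `dc_bias`/`mod_depth` parameters are assumptions.

```python
import numpy as np

def amplitude_modulate(audio, dc_bias=0.5, mod_depth=0.4):
    """Map a normalized audio waveform (values in [-1, 1]) onto a
    laser-intensity control signal via amplitude modulation.

    A laser driver cannot emit negative power, so the carrier is
    biased to dc_bias and the audio varies the intensity around it.
    The MEMS mike responds to the varying light intensity as if it
    were sound pressure, so the recovered signal tracks `audio`.
    """
    audio = np.clip(np.asarray(audio, dtype=float), -1.0, 1.0)
    return dc_bias * (1.0 + mod_depth * audio)

# Example: a 1 kHz tone sampled at 48 kHz, standing in for a voice command
fs = 48_000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)
drive = amplitude_modulate(tone)
# The drive signal never goes negative, so it is a valid intensity
assert drive.min() >= 0.0
```

With `mod_depth < 1` the intensity stays strictly positive, which keeps the modulation linear; a real attack additionally depends on aiming the beam precisely at the mike's acoustic port.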
This allows them to issue commands to voice-enabled devices, such as garage door openers, smart locks, and home security system cameras.
The researchers shared their findings with Amazon, Google, and the other vendors before they went public last year with the initial research. Rampazzi says Amazon has since made some slight updates to Alexa's software – for example, such that an attacker would be unable to brute-force the device PIN.
The new generation of devices also has a cover over the mike, she notes, although the researchers don't know whether that was in response to their attack. "The cover makes it harder to find the location of the mike and to be able to inject [light commands] into the device."
Vendors could make other hardware adjustments to protect the devices from the Light Commands attack, she says, such as ensuring the mike isn't susceptible to light, or adding authentication techniques to the software so an unauthorized user can't commandeer the digital voice assistant.
