‘Light Commands’ attack: hacking Alexa, Siri, and other voice assistants via laser beam

Pierluigi Paganini November 05, 2019

Experts demonstrated that it is possible to hack smart voice assistants like Siri and Alexa using a laser beam to send them inaudible commands.

Researchers from the University of Michigan and the University of Electro-Communications (Tokyo) have devised a new technique, dubbed “light commands,” to remotely hack Alexa and Siri smart speakers. Using a laser beam, attackers can send them inaudible commands.

The “light commands” attack exploits a design flaw in the smart assistants’ micro-electro-mechanical systems (MEMS) microphones. MEMS microphones convert voice commands into electrical signals, but the researchers demonstrated that they also react to laser beams.

“Light Commands is a vulnerability of MEMS microphones that allows attackers to remotely inject inaudible and invisible commands into voice assistants, such as Google assistant, Amazon Alexa, Facebook Portal, and Apple Siri using light.” reads the website set up to describe the Light Commands technique.

“In our paper we demonstrate this effect, successfully using light to inject malicious commands into several voice controlled devices such as smart speakers, tablets, and phones across large distances and through glass windows.”

The tests conducted by the experts demonstrate that it is possible to send inaudible commands via laser beam from as far as 110 meters (360 feet). Popular voice assistants, including Amazon Alexa, Apple Siri, Facebook Portal, and Google Assistant, are vulnerable to this remote attack.

“We propose a new class of signal injection attacks on microphones based on the photoacoustic effect: converting light to sound using a microphone. We show how an attacker can inject arbitrary audio signals to the target microphone by aiming an amplitude-modulated light at the microphone’s aperture.” reads the paper published by the experts. “We then proceed to show how this effect leads to a remote voice-command injection attack on voice-controllable systems. Examining various products that use Amazon’s Alexa, Apple’s Siri, Facebook’s Portal, and Google Assistant, we show how to use light to obtain full control over these devices at distances up to 110 meters and from two separate buildings.”

In a real-life attack scenario, an attacker could stand outside an office or a house and aim a laser at a voice assistant to instruct it to unlock a door or perform other malicious actions.

MEMS microphones are composed of a diaphragm and an ASIC. When the diaphragm is hit by sound or light, it produces electrical signals that are translated into commands.

The experts demonstrated how to “encode” commands in the intensity of a laser beam, causing the diaphragm to move. The movements of the diaphragm generate electrical signals representing the attacker’s commands.

The researchers ran various tests: they measured light intensity using a photodiode power sensor and evaluated the microphone’s response to different light intensities.

“As the light intensity emitted from a laser diode is directly proportional to the diode’s driving current, we can easily encode analog signals via the beam’s intensity by using a laser driver capable of amplitude modulation,” continues the paper.

“We recorded the diode current and the microphone’s output using a Tektronix MSO5204 oscilloscope,” they said. “The experiments were conducted in a regular office environment, with typical ambient noise from human speech, computer equipment, and air conditioning systems.”
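To picture the amplitude-modulation step the paper describes, here is a minimal Python sketch: it maps an audio command waveform onto a diode driving current, which in turn sets the emitted light intensity. The bias current, modulation depth, and sample rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch of amplitude-modulating a laser diode current with an
# audio command. The bias current, modulation depth, and sample rate
# below are illustrative assumptions, not the researchers' parameters.

SAMPLE_RATE = 16_000   # Hz, a typical rate for voice audio
I_BIAS = 0.200         # A, DC operating current that keeps the diode lasing
I_DEPTH = 0.050        # A, peak current swing that encodes the audio

def audio_to_diode_current(audio: np.ndarray) -> np.ndarray:
    """Map an audio waveform in [-1, 1] to a diode driving current.

    Because emitted light intensity is roughly proportional to the
    driving current, this current waveform effectively becomes the
    light-intensity waveform that the MEMS diaphragm responds to.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return I_BIAS + I_DEPTH * audio

# Example: one second of a 440 Hz tone standing in for a voice command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = 0.8 * np.sin(2 * np.pi * 440 * t)
current = audio_to_diode_current(tone)  # waveform to feed the laser driver
```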

Below are some PoC videos of the ‘light commands’ attack shared by the experts:

The experts also explored the feasibility of the attack and found that hackers could mount it with cheap equipment. The researchers explained that they used a simple laser pointer, available for as little as $14 on Amazon and eBay, along with a laser driver designed to drive a laser diode by providing a current, and a sound amplifier.


The list of voice assistants using MEMS microphones that might be vulnerable to the light commands attack includes Alexa, Siri, Portal and Google Assistant.

The good news is that the researchers are not aware of the attack being exploited in the wild. They reported their findings to voice assistant vendors and are collaborating with them to secure their devices.

Countermeasures include the implementation of further authentication, sensor fusion techniques, or the use of a cover on top of the microphone to prevent light from hitting it.

“An additional layer of authentication can be effective at somewhat mitigating the attack,” they concluded. “Alternatively, in case the attacker cannot eavesdrop on the device’s response, having the device ask the user a simple randomized question before command execution can be an effective way at preventing the attacker from obtaining successful command execution.”
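As a rough illustration of the randomized-question idea quoted above, the toy Python sketch below gates a sensitive command behind a spoken challenge. The question list and the ask_user/command interfaces are hypothetical stand-ins for a real assistant’s voice I/O.

```python
import random

# Toy sketch of the randomized-question countermeasure described in the
# paper's conclusion. The challenge list and ask_user() interface are
# hypothetical; a real assistant would speak the question aloud and
# listen for the user's spoken answer.

CHALLENGES = [
    ("Say the word 'apple' to confirm.", "apple"),
    ("Say the word 'river' to confirm.", "river"),
    ("Say the word 'seven' to confirm.", "seven"),
]

def execute_sensitive_command(command, ask_user):
    """Run `command` only after the user answers a random challenge.

    A laser-based attacker can inject audio but typically cannot hear
    the device's spoken question, so they cannot know which answer
    this particular randomized challenge expects.
    """
    question, expected = random.choice(CHALLENGES)
    answer = ask_user(question)  # speak the question, capture the reply
    if answer.strip().lower() == expected:
        command()
    else:
        print("Challenge failed; command rejected.")

# Example usage with a stubbed voice interface:
execute_sensitive_command(
    command=lambda: print("Unlocking front door."),
    ask_user=lambda q: input(q + " "),
)
```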


Pierluigi Paganini

(SecurityAffairs – light commands attack, voice assistants)



