Fooling Voice Assistants with Lasers
Siri, Alexa, and Google Assistant are vulnerable to attacks that use lasers to inject inaudible (and sometimes invisible) commands into the devices and surreptitiously cause them to unlock doors, visit websites, and locate, unlock, and start vehicles, researchers reported in a paper published on Monday. Dubbed Light Commands, the attack also works against Facebook Portal and a variety of phones.
Shining a low-powered laser into these voice-activated systems allows attackers to inject commands of their choice from as far away as 360 feet (110 m). Because voice-controlled systems often don't require users to authenticate themselves, the attack can frequently be carried out without a password or PIN. Even when the systems require authentication for certain actions, it may be feasible to brute-force the PIN, since many devices don't limit the number of guesses a user can make. Among other things, light-based commands can be sent from one building to another and can penetrate glass when a vulnerable device is kept near a closed window.
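To see why an unthrottled PIN is so weak, consider a rough back-of-the-envelope sketch (the guess rate below is an illustrative assumption, not a figure from the paper):

```python
# Sketch: a 4-digit PIN space with no guess limit is tiny.
from itertools import product

def all_pins(length=4, digits="0123456789"):
    """Enumerate every possible PIN of the given length."""
    return ["".join(p) for p in product(digits, repeat=length)]

pins = all_pins()
print(len(pins))  # 10000 candidate PINs

# Assumed (hypothetical) rate: one injected guess every 3 seconds.
# Worst case, the whole space is exhausted in under 9 hours.
worst_case_hours = len(pins) * 3 / 3600
print(round(worst_case_hours, 1))
```

With rate limiting or a lockout after a handful of failures, the same space would take years to search, which is why the lack of a guess limit is the real weakness here.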
me • November 11, 2019 6:36 AM
@all
quick memo about why this is possible and how it works:
MEMS microphones are microchips, and like any silicon microchip they behave a bit like a photovoltaic panel and are influenced by light.
afaik the reason ICs are packaged in black plastic is to block light: if light can reach the die it causes interference and can crash a CPU, for example.
you can check this with a normal LED: apply current and it lights up; shine light on an LED and it generates current (for example, you can attach an LED to a PC mic input to clone an IR remote).
condenser microphones and other microphone types are not affected by this (though they might be if you target the amplifier IC instead of the mic itself)
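Since the mic's photoelectric response tracks the laser's intensity, the attacker just amplitude-modulates the laser with the command audio and the mic "hears" the envelope. A minimal sketch of that modulation step (function names and the bias/depth values are illustrative assumptions, not from the paper):

```python
import math

def amplitude_modulate(audio, bias=0.5, depth=0.4):
    """Map audio samples in [-1, 1] to laser drive levels in [0, 1].

    The laser stays on (bias) and the audio rides on top (depth);
    the mic's photoelectric response recovers the audio as if it
    were sound pressure.
    """
    return [bias + depth * s for s in audio]

# Toy input: 10 ms of a 1 kHz tone sampled at 16 kHz.
sr = 16000
tone = [math.sin(2 * math.pi * 1000 * n / sr) for n in range(sr // 100)]
drive = amplitude_modulate(tone)

# The drive signal never goes negative or over full power.
assert all(0.0 <= d <= 1.0 for d in drive)
```

In a real attack the drive signal would feed a laser-diode current driver; the point of the sketch is only that the command audio maps directly onto light intensity.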