
Shine on: Boffins bedazzle Alexa and her voice-controlled assistant kin with silent laser-injected commands

How beams of light can boss around smart speakers

Boffins affiliated with the University of Electro-Communications in Japan and America's University of Michigan have devised a way to use lasers to inject audio commands into mic-equipped devices.

Thus you can wordlessly hijack someone's voice-controlled smart speaker, just by shining a laser onto its microphone.

In a paper [PDF] presented at this week's Usenix Security Symposium, "Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems," computer scientists Takeshi Sugawara, Benjamin Cyr, Sara Rampazzi, Daniel Genkin, and Kevin Fu describe how they managed to use amplitude-modulated light to issue unauthorized voice commands to voice-controlled assistant hardware like Amazon's Alexa-powered speakers, Facebook Portal, Google Home, and Apple's Siri-equipped iPhone, as well as to vehicles configured to accept voice commands through mobile apps.

"We discovered and investigated what we call Light Commands, commands issued by light signal injection into MEMs microphones," explained Cyr, a graduate student research assistant at the University of Michigan, in a video [.MP4] summarizing the research. "We found that in some devices we can inject a command 110 meters away with the equivalent power of a laser pointer, and show that this works through glass windows and between buildings."

Micro-electromechanical systems (MEMS) technology is a fabrication process for etching tiny mechanical components into a chip. It's used to make small, inexpensive microphones that end up in smartphones and home assistant hardware, among other things.


The boffins, pointing to previous research demonstrating that lasers can induce errors in semiconductors, explain that MEMS microphones include an ASIC (application-specific integrated circuit) that converts capacitive changes at the diaphragm into an electrical signal.

Light, when directed at such mics, can generate a voltage signal based on the light intensity.

"As strong light hits a semiconductor chip, it induces a photocurrent across a transistor, where the current’s strength is proportional to the light intensity," the paper explained. "The analog part of the microphone’s ASIC recognizes this photocurrent as a genuine signal from the diaphragm, resulting in the microphone treating light as sound."

The scientists theorize that there's both a photoelectric effect on the ASIC and a photoacoustic effect, in which laser-driven thermal changes physically move the diaphragm. Given that behavior, the team faced the challenge of modulating a voice command into a sequence of laser light amplitude changes and aiming that light precisely at a visible smart speaker.

"When we aim this laser at the microphone port of a voice controllable system, it responds to the light as if someone had spoken the command," explained Cyr.

The attack required about $1,500 in hardware for a scientific-grade laser driver, plus a laptop and laser diode, though we're told their technique could be replicated with more affordable components.
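In signal-processing terms, the injection amounts to ordinary amplitude modulation: the recorded voice command is used to vary the laser's drive current, and therefore its brightness, around a constant bias, and the microphone's photo-response turns that brightness back into an audio-like signal. What follows is a minimal, illustrative Python sketch of that idea, not the team's actual tooling; the bias current, modulation depth, responsivity figures, and the 1 kHz tone standing in for a voice command are all made-up placeholder values.

# Illustrative sketch of the amplitude-modulation idea behind Light Commands.
# NOT the researchers' tooling: the constants below are placeholder assumptions.

import numpy as np

FS = 44_100                # sample rate, Hz
I_BIAS = 0.200             # assumed laser diode bias current, amps
MOD_DEPTH = 0.5            # fraction of the bias swung by the audio
RESPONSIVITY = 0.4         # assumed photo-response of the mic's circuitry, amps per watt
WATTS_PER_AMP = 1.0        # assumed electro-optical slope of the diode, watts per amp

# Stand-in for a recorded voice command: a 1 kHz tone, normalised to [-1, 1].
t = np.arange(0, 1.0, 1 / FS)
audio = np.sin(2 * np.pi * 1_000 * t)

# Amplitude modulation: the audio waveform rides on top of a DC bias,
# so the laser stays on while its brightness tracks the speech.
drive_current = I_BIAS * (1 + MOD_DEPTH * audio)

# Optical power follows drive current; the MEMS mic then produces a
# photocurrent proportional to that light intensity, per the paper.
optical_power = WATTS_PER_AMP * drive_current
photocurrent = RESPONSIVITY * optical_power

# Strip the DC component, as the microphone's signal chain would, and check
# that what the mic "hears" is essentially the injected command.
recovered = photocurrent - photocurrent.mean()
corr = np.corrcoef(audio, recovered)[0, 1]
print(f"correlation between injected audio and recovered signal: {corr:.3f}")

Keeping the modulation depth below one means the beam never switches off entirely, so its intensity tracks the audio cleanly instead of clipping; in practice the researchers drove a real laser diode through a scientific-grade driver rather than anything like this simulation.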

They also managed to use phone apps for a Tesla and Ford to issue voice commands that unlocked the vehicles, among other things. Below is a video depicting their work.

[Youtube video: the researchers demonstrate the Light Commands attack]

The scientists tested 17 different hardware devices running Alexa, Google Assistant, or Siri voice assistant software, and found they all exhibited the same vulnerability.

"While the power required from the attacker varies from 0.5 mW (Google Home) to 60 mW (Galaxy S9), all the devices are susceptible to laser-based command injection, even when the device’s microphone port (e.g., Google Home Mini) is covered with fabric and / or foam," the paper stated.

Device coverings, however, can defend against light command injection if the covering is sufficiently dense. The boffins pointed to Apple's more heavily padded HomePod as an example. Keeping smart speakers out of sight of windows can also prevent such attacks, and the need for fairly precise aiming means mobile devices, which tend to be moved around, make harder targets than a stationary smart speaker.

The boffins say they have disclosed their findings to Apple, Google, and Facebook so mitigations can be designed. Further information can be had at the Light Commands website. ®
