Sam Stelfox

What is Light Commands? Smart home cybersecurity news [November 2019 edition]

November was a busy month for smart home cybersecurity news. You might recall hearing about Finland's new cybersecurity labeling process for IoT devices, a global science and safety company's introduction of an IoT Security Rating, and Amazon Ring privacy policies coming under scrutiny. While those stories made headlines, today's focus is on another one: a new hacking method for voice assistant IoT devices.

What is Light Commands?


On November 4, 2019, researchers from The University of Electro-Communications and The University of Michigan released the research paper Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems. The paper delves into how IoT devices with Alexa, Siri and Google Assistant integrations were found susceptible to a new remote command injection technique called Light Commands. Using this hacking method, the voice assistants could be controlled not only by close-proximity voice commands, but also from long distances by light delivered via laser beam.


Also referred to as laser-based audio injection attacks, the vulnerability lies in the devices' MEMS (micro-electro-mechanical systems) microphones, which respond to light much as they do to sound. This makes it possible for concentrated light, such as from a laser beam, to trigger commands on the voice assistants just as one's voice would: the attacker amplitude-modulates the laser's intensity with an audio signal, and the microphone picks up that fluctuation as if it were sound pressure.
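To make the modulation principle concrete, here is a minimal sketch in Python. It is not the researchers' tooling; all numbers, names, and functions below are invented for illustration. The idea it demonstrates is simply that an audio waveform encoded as fluctuations in laser intensity (which can never be negative, hence the DC bias) is recovered by the microphone as the AC component around that bias.

```python
import math

# Toy illustration of the amplitude-modulation principle behind Light
# Commands. All constants and function names are hypothetical.

SAMPLE_RATE = 8_000   # samples per second
TONE_HZ = 440         # stand-in for a voice command waveform
DC_BIAS = 0.5         # constant laser intensity (normalized 0..1)
MOD_DEPTH = 0.4       # how strongly the audio modulates the beam

def audio_sample(t):
    """A toy 'voice command' signal in the range [-1, 1]."""
    return math.sin(2 * math.pi * TONE_HZ * t)

def laser_intensity(t):
    """Amplitude-modulated laser drive. A DC bias keeps the value
    non-negative, since light intensity cannot drop below zero."""
    return DC_BIAS + (MOD_DEPTH / 2) * audio_sample(t)

def mic_response(intensity):
    """The MEMS microphone reacts to intensity fluctuations around
    the DC level, so the injected 'sound' is the AC component."""
    return intensity - DC_BIAS

# The signal the voice assistant "hears" tracks the original audio:
n_samples = SAMPLE_RATE // 100  # 10 ms of signal
recovered = [mic_response(laser_intensity(n / SAMPLE_RATE))
             for n in range(n_samples)]
expected = [(MOD_DEPTH / 2) * audio_sample(n / SAMPLE_RATE)
            for n in range(n_samples)]
assert all(abs(r - e) < 1e-12 for r, e in zip(recovered, expected))
```

The assertion at the end confirms that the demodulated microphone signal is a scaled copy of the original audio, which is why a laser pointed at the microphone port can stand in for a spoken command.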

In the paper, the researchers explain how they tested laser intensities of up to 60 mW and distances of up to 50 meters. During these tests, they were able to execute the following kinds of device commands:

  • What time is it?
  • Set the volume to zero
  • Purchase a laser pointer
  • Open the garage door (see below)


Impact of Light Commands on smart home cybersecurity

While it's not news that smart devices introduce new attack vectors into the home (recall initiatives like National Cybersecurity Awareness Month that aim to educate the public about such risks), Light Commands further highlights the importance of smart home cybersecurity.

As can be seen in the video above, attackers can use Light Commands to infiltrate the home and issue commands to the smart speaker from a significant distance. While the attacker needs certain equipment, the research team shows how such an attack setup can be recreated:

Light Commands Setup

Screenshot taken December 4, 2019 from Light Commands website.

This diagram shows the low-cost setup, which can run between $400 and $600, while a traditional setup using lab equipment runs into the thousands. The researchers also state that this attack method has not yet been seen performed in a malicious, targeted way. Though this is good news for now, we know from other device attack methods (e.g., supply chain attacks) that attackers learn to adapt and exploit known and new attack vectors in different ways.

Past smart home cybersecurity news roundup editions:
