Researchers use lasers to hack Siri, Alexa, Google assistants

Researchers have used programmable low-powered lasers to remotely control voice-activated personal assistants such as Apple’s Siri, Amazon’s Alexa, and Google Assistant from up to 360 feet away. The attack exploits a vulnerability in microphones built on micro-electro-mechanical systems (MEMS), which the researchers discovered respond to modulated laser light just as they respond to sound.
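
To make the mechanism concrete, here is a minimal Python sketch of the amplitude-modulation idea behind the attack: the attacker encodes an audio waveform in the laser’s light intensity, and the MEMS microphone converts that varying intensity back into a signal as if it were sound. All of the numbers (sample rate, bias power, modulation depth) are illustrative assumptions, not values from the Light Commands research.

```python
import numpy as np

# Sketch parameters -- illustrative assumptions, not figures from the paper.
SAMPLE_RATE = 48_000     # audio sample rate in Hz
DURATION = 1.0           # seconds of "voice command" to encode
BIAS_MW = 30.0           # assumed laser bias power (mW), keeps the diode lasing
MOD_DEPTH_MW = 25.0      # assumed peak modulation swing (mW)

# Stand-in for a recorded voice command: a toy mix of vocal-range tones.
t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
audio = 0.6 * np.sin(2 * np.pi * 220 * t) + 0.4 * np.sin(2 * np.pi * 880 * t)
audio /= np.max(np.abs(audio))  # normalize to [-1, 1]

# Amplitude-modulate the laser's optical power with the audio waveform.
# The MEMS diaphragm responds to the varying light intensity roughly as it
# would to sound pressure, so the assistant "hears" the command.
laser_power_mw = BIAS_MW + MOD_DEPTH_MW * audio

# Light intensity cannot go negative, so clip to the diode's usable range.
laser_power_mw = np.clip(laser_power_mw, 0.0, None)

print(f"drive signal range: {laser_power_mw.min():.1f}-{laser_power_mw.max():.1f} mW")
```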

The attack method, called Light Commands, was tested only on the three big-name players in the personal assistant space, but the researchers believe it will affect any microphone that uses MEMS (presumably including devices running Microsoft Cortana, Baidu DuerOS, or other digital assistants).

Although the attack is essentially a proof of concept at this stage and has significant limitations (such as requiring a direct line of sight to the device), the researchers admit that they don’t fully understand why the exploit works, which could open the door to others finding ways to make it even more effective.

Voice-activated systems vary in how much control they allow over a device before requiring user authentication. Apple’s track record with Siri is mostly decent in this respect: most critical voice-activated functions require a passcode or biometric verification (Touch ID or Face ID) before the request is completed.

In some circumstances, the Light Commands attack can be used to brute-force a device passcode, so the usual security advice applies: choose a complex, hard-to-guess passcode, and set your device to lock after a certain number of incorrect attempts. Even that basic level of security should protect you against this new and admittedly sci-fi-sounding method of remote device hacking.
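
For a rough sense of why that lockout matters, here is a back-of-the-envelope Python calculation. The guess rate is a made-up assumption; entering passcodes via synthesized voice commands is slow, and a lockout after a handful of wrong attempts makes the keyspace size irrelevant.

```python
# Back-of-the-envelope math behind the passcode advice.
GUESSES_PER_MINUTE = 2  # assumed rate for voice-driven passcode entry

for digits in (4, 6):
    keyspace = 10 ** digits                        # possible passcodes
    avg_minutes = (keyspace / 2) / GUESSES_PER_MINUTE
    print(f"{digits}-digit passcode: {keyspace:,} combinations, "
          f"~{avg_minutes / 60 / 24:.0f} days on average to brute-force")

# A lockout after ~10 incorrect attempts stops the attack long before
# either keyspace is meaningfully explored.
```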

You can find out more about the Light Commands attack at Ars Technica or at the Light Commands homepage.

Related:

Researchers came up with a similar attack, called DolphinAttack, in 2017, in which inaudibly high-pitched voice commands were used to control an iPhone.

DolphinAttack is a method of sending a “Hey Siri” voice command at a pitch so high that humans cannot hear it. Image: Guoming Zhang via YouTube.
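
Conceptually, DolphinAttack amplitude-modulates the spoken command onto an ultrasonic carrier; nonlinearity in the microphone hardware then demodulates it back into the audible band, where the assistant recognizes it. Below is a minimal Python sketch of that modulation step, with purely illustrative parameters (carrier frequency, modulation depth, and a toy stand-in for the command waveform).

```python
import numpy as np

# Illustrative numbers only -- real DolphinAttack parameters vary by device.
SAMPLE_RATE = 192_000   # high rate needed to represent an ultrasonic carrier
CARRIER_HZ = 30_000     # above the ~20 kHz upper limit of human hearing
MOD_INDEX = 0.8         # amplitude-modulation depth

t = np.linspace(0.0, 1.0, SAMPLE_RATE, endpoint=False)
# Toy stand-in for the "Hey Siri" voice command waveform.
command = np.sin(2 * np.pi * 300 * t)

# Amplitude-modulate the command onto the ultrasonic carrier. The result is
# inaudible to humans, but nonlinearity in the microphone's front end
# demodulates it back into the audible band for the speech recognizer.
signal = (1.0 + MOD_INDEX * command) * np.cos(2 * np.pi * CARRIER_HZ * t)
signal /= np.max(np.abs(signal))  # normalize for DAC/speaker output
```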

How can I learn more?

This week on episode 108 of the Intego Mac Podcast, Intego’s experts discuss the new Light Commands attack, as well as Apple’s privacy policy page update and whether an iPad can replace your MacBook. Be sure to follow the podcast so you never miss an episode.

You can also subscribe to our e-mail newsletter and keep an eye here on The Mac Security Blog for the latest Apple security and privacy news. And don’t forget to follow Intego on your favorite social media channels.