
Ingenious Attack Shows How Siri Could Be Hijacked Silently from 16 Feet Away, but Don’t Lose Any Sleep



It’s been four years since Apple introduced Siri, putting a mini-secretary into every iPhone owner’s pocket.

I know it’s made my day-to-day life easier — providing an easy way to reply to an SMS text message while I’m on the move, or letting me set the timer when my fingers are mucky from cooking in the kitchen. But the technology has also been at the heart of a number of security problems, most recently providing a sneaky way to bypass the iOS 9 lock screen.

Well now, French security researchers appear to have uncovered a whole different way in which Siri (and, to be fair, Google Now on Android phones) could potentially be exploited by hackers to hijack control of your smartphone, without you ever realising that any funny business is afoot.

Now, it’s important to stress that this is a *potential* problem. Although the attack is quite ingenious, once you hear the details of just what the researchers had to do, and how the iPhone has to be prepared before a successful attack can proceed, you will probably decide that this particular threat is not one to lose much sleep over.

But that doesn’t make it any less fascinating.

In a technical paper published by the IEEE, a team from the French government’s Network and Information Security Agency (ANSSI) claims to have discovered “a new silent remote voice command injection technique,” which could allow them to control Siri via radio waves from a distance of up to 16 feet, if — and this is crucial — a pair of headphones with an in-built microphone (such as the standard earbuds shipped by Apple) is plugged into the iPhone.

Armed with an amplifier, a laptop, an antenna and a Universal Software Radio Peripheral (USRP) radio, attackers could apparently send surreptitious signals that the headphones’ cord would pick up like an antenna and convert into the electrical signals understood as speech by the iDevice’s operating system.
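
To make the signal trick a little more concrete, here is a minimal, purely illustrative sketch in Python (using numpy, my choice rather than anything the researchers used) of the general idea: an audio “command” is amplitude-modulated onto a radio-frequency carrier, and a crude envelope detector recovers it, roughly analogous to what happens when the induced voltage is demodulated by the phone’s audio front-end. The sample rate, carrier frequency and test tone are arbitrary assumptions for illustration, not the parameters or exact modulation scheme described in the ANSSI paper.

    # Purely illustrative sketch; not the researchers' actual tool chain.
    import numpy as np

    fs = 1_000_000                      # simulation sample rate in Hz (arbitrary)
    t = np.arange(int(fs * 0.01)) / fs  # 10 ms of signal

    # Stand-in for the spoken command: a 1 kHz tone. A real attack would use
    # recorded speech, e.g. "Hey Siri, open <some website>".
    audio = 0.5 * np.sin(2 * np.pi * 1_000 * t)

    # Amplitude-modulate the audio onto a carrier. 100 kHz is an arbitrary
    # choice here; a real transmission would use a far higher frequency,
    # picked so that the headphone cable couples to it efficiently.
    carrier = np.cos(2 * np.pi * 100_000 * t)
    transmitted = (1 + audio) * carrier

    # Driven hard enough, the phone's audio front-end behaves non-linearly and
    # acts like a crude envelope detector, recovering something close to the
    # original audio once the result is low-pass filtered.
    recovered_envelope = np.abs(transmitted)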

Somewhat unlikely and impractical though it may be, the attack is quite ingenious.

No words have been spoken, and yet Siri has received a command.

Perhaps the most plausible abuse of the French researchers’ discovery would be to order iOS to visit a particular website, hosting a malicious exploit that could infect the phone and install malware. Alternatively, unauthorised messages could be sent from the compromised device.

In a demonstration video, the researchers showed how they were able to transmit silent commands to an Android smartphone, forcing it to visit the ANSSI website.

However, there are further hurdles for the attack to work against iPhones. Siri needs to be enabled with Voice Activation turned on to allow “Hey Siri”, headphones with a built-in microphone need to be plugged into the targeted device, and the hardware required to perform the attack is not insubstantial.

Indeed, the researchers say that in its smallest form (which can fit inside a backpack) the range is limited to about 6.5 feet. A more powerful version that would require larger batteries could only fit practically inside a car or van, giving a range of 16 feet or more.

Regardless, as Wired reports, the researchers believe that the vulnerability could create a real security headache:

“The possibility of inducing parasitic signals on the audio front-end of voice-command-capable devices could raise critical security impacts,” the two French researchers, José Lopes Esteves and Chaouki Kasmi, write in a paper published by the IEEE. Or as Vincent Strubel, the director of their research group at ANSSI puts it more simply, “The sky is the limit here. Everything you can do through the voice interface you can do remotely and discreetly through electromagnetic waves.”

I would agree that, in theory, anything that can be said to Siri could be sent secretly through radio waves, but I think there’s quite a jump between that and describing it as “inducing parasitic signals.”

“Parasitic” implies some malware-like component and, as we all know, it has proven immensely difficult for hackers to infect iPhones with malicious code without going to the effort of jailbreaking them or exploiting the enterprise provisioning feature that Apple provides for companies that wish to roll out their own apps to staff.

That’s why threats such as the YiSpecter iOS malware, which had to abuse enterprise certificates and private APIs to infect non-jailbroken devices, are so rare and had to take such a convoluted route onto users’ phones.

I don’t see why the ability to send remote Siri commands significantly increases the risk of iPhones and iPads becoming infected.

Of course, if you’re an Android user — particularly one who has found it problematic to get the latest operating system patches, and who might be of interest to intelligence agencies willing to attempt an attack like this — then you may be more at risk.

My advice? If you’re concerned, consider turning off Siri when your phone is locked, or at least disabling Voice Activation. And unplug your headphones when you’re not using them!

About Graham Cluley

Graham Cluley is an award-winning security blogger, researcher and public speaker. He has been working in the computer security industry since the early 1990s, having been employed by companies such as Sophos, McAfee and Dr Solomon's. He has given talks about computer security for some of the world's largest companies, worked with law enforcement agencies on investigations into hacking groups, and regularly appears on TV and radio explaining computer security threats. Graham Cluley was inducted into the InfoSecurity Europe Hall of Fame in 2011, and was given an honorary mention in the "10 Greatest Britons in IT History" for his contribution as a leading authority in internet security. Follow him on Twitter at @gcluley.