
Your phone wakes up. Its assistant starts reading out your text messages. To everyone around. You panic. How? Ultrasonic waves

Not OK Google: Android, Siri sink in SurfingAttack

Video Voice commands encoded in ultrasonic waves can, in the best-case scenario, silently activate a phone's digital assistant and order it to do stuff like read out text messages and make phone calls, we're told.

The technique, known as SurfingAttack, was presented at the Network and Distributed Systems Security Symposium in California this week. In the video demo below, a handset placed on a table wakes up as its voice assistant is activated by inaudible ultrasonic waves. Silent commands transmitted via these pulses stealthily instruct the assistant to perform various tasks, such as taking a photo with the front-facing camera, reading out the handset's text messages, and making fraudulent calls to contacts.

It's basically a way to get up to mischief with Google Assistant or Apple's Siri on a nearby phone without the owner realizing it's you causing the shenanigans, or why it's happening – that is, if they even hear it wake up and start doing stuff. It's a neat trick that could be used to ruin someone's afternoon or snoop on them, or not work at all. There are caveats. It's just cool, OK.

Youtube Video

Eggheads at Michigan State University, University of Nebraska-Lincoln, and Washington University in St Louis in the US, and the Chinese Academy of Sciences, tested their SurfingAttack technique on 17 models of gadgets; 13 were Android devices with Google Assistant, and four were iPhones that had Apple’s Siri installed.

SurfingAttack successfully took control of 15 of the 17 smartphones. Only Huawei’s Mate 9 and Samsung’s Galaxy Note 10+ were immune to the technique.

“We want to raise awareness of such a threat,” said Ning Zhang, an assistant professor of computer science and engineering at Washington University in St Louis, on Thursday. “I want everybody in the public to know this.”

Here’s one way to pull it off: a laptop, located in a separate room from the victim’s smartphone, connects via Wi-Fi or Bluetooth to a waveform generator near the victim's phone, perhaps on the same table. The generator emits voice commands, crafted by the laptop, as ultrasonic waves. Technically, a circular piezoelectric disc attached underneath the table on which the phone is resting emits the pulses from the generator.

The silent ultrasonic wave propagates through the table, causing vibrations that are picked up by the smartphone's microphone. The signals command the assistant on the phone to do things like “read my messages” or call a contact. A wiretapping device, also placed underneath the table, records the assistant's spoken replies and relays the audio back to the laptop, which transcribes the responses.
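For the terminally curious, here's roughly how a command gets smuggled above the range of human hearing. Attacks of this kind typically amplitude-modulate the recorded command onto an ultrasonic carrier; the nonlinear response of the phone's MEMS microphone then demodulates the signal back into the audible band, where the assistant's speech recognition picks it up. The Python sketch below is illustrative only – the carrier frequency, modulation depth, and filenames are our assumptions, not the researchers' actual tooling.

import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000   # assumed inaudible carrier; the real attack's parameters may differ
OUT_RATE = 192_000    # output sample rate high enough to represent the carrier

# Load a mono recording of the spoken command (hypothetical filename)
rate, cmd = wavfile.read("ok_google_read_my_messages.wav")
baseband = cmd.astype(np.float64)
baseband /= np.max(np.abs(baseband))           # normalise to [-1, 1]

# Resample the command to the output rate (simple linear interpolation)
t_old = np.arange(len(baseband)) / rate
t_new = np.arange(0.0, t_old[-1], 1.0 / OUT_RATE)
baseband = np.interp(t_new, t_old, baseband)

# Amplitude modulation: carrier * (1 + m * signal). A microphone's
# nonlinear response effectively squares the incoming wave, recreating
# an audible copy of the command for the assistant's speech recogniser.
m = 0.8                                        # assumed modulation depth
carrier = np.sin(2.0 * np.pi * CARRIER_HZ * t_new)
payload = carrier * (1.0 + m * baseband) / (1.0 + m)

wavfile.write("surfing_payload.wav", OUT_RATE, (payload * 32767).astype(np.int16))

Driving the piezoelectric disc with a waveform like that is then a matter of feeding it through the signal generator and an amplifier, per the setup described above.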

Tiny little small caveat: you'll need to imitate your victim's voice

So here’s the catch: to activate someone's smartphone, the attacker has to imitate or synthesize the victim’s voice.

Smartphone assistants are trained on their owners' voices so they won't respond to strangers. A miscreant therefore has to find a way to craft realistic imitations of the victim’s voice. It’s not too difficult with some of the machine-learning technology out there already, though fiends will have to collect enough training samples of the victim’s voice for the AI to learn from. Qiben Yan, first author of the paper and an assistant professor of computer science at Michigan State University, told The Register the team used Lyrebird to mimic voices in their experiment.

Victims must also have given Google Assistant or Siri permission to control their phones: the assistants can only perform a limited number of functions unless the user has already unlocked the handset. In other words, even if you can imitate a person and send their device ultrasonic waves, the phone's assistant may not be able to do much damage at all.

For example, if a target has not toggled their smartphone's settings to allow the digital assistant to automatically unlock the device, it’s unlikely SurfingAttack will work.


“We did it on metal. We did it on glass. We did it on wood,” Zhang said. SurfingAttack also worked with the device's microphone facing different directions on the tables, and with the circular piezoelectric disc and the wiretapping device attached underneath a table 30 feet away from the phone.

The best way to defend yourself from these attacks is to turn off voice commands, or only allow assistants to work when a handheld is unlocked. Alternatively, placing your smartphone on fabric on a table would make it more difficult for the ultrasound signals to be transmitted.

Despite all these caveats, the academics reckoned SurfingAttack posed a serious potential threat. "We believe it is a very realistic attack," Yan told El Reg. "The signal waveform generator is the only equipment which is bulky. Once we replace it with a smartphone, the attack device can be portable.

"One great advantage of SurfingAttack is that the attack equipment is placed underneath the table, which makes the attack hard to discover. For synthesizing victims’ voice, we have to capture victims’ voice recording. However, if we want to target a specific user, it doesn’t seem to have any problem in capturing the users’ voice commands or synthesizing them after recording the victims’ voice.

"Moreover, the Google Assistant is not very accurate in matching a specific human voice. We found that many smartphones’ Google Assistants could be activated and controlled by random people’s voices. Also, many people left their phones unattended on the table, which creates opportunity for the attackers to send voice commands to control their devices." ®
