Researchers Find Inaudible Attack On Siri, Google Voice Assistants

by Laurie Sullivan, Staff Writer @lauriesullivan, March 2, 2020

Researchers have demonstrated how voice assistants can be secretly activated to make phone calls, take photos, and read back text messages without ever physically touching the device.

A team of researchers from U.S. and Chinese universities, including the University of Nebraska-Lincoln, the SEIT Lab, and Washington University in St. Louis, dubbed the hack SurfingAttack for its ability to remotely control voice assistants using inaudible ultrasonic waves.

Such technologies can significantly improve quality of life, but they can also change the landscape of cyber threats by making inaudible ultrasonic attacks possible.

The researchers prepared a demo that targeted Siri, Google Assistant, and Bixby. These voice assistants are designed to respond when they detect trigger phrases such as “OK Google.”

SurfingAttack was discovered by five researchers from a variety of schools. The researchers tested the method on 17 smartphone models from Apple, Google, Samsung, Motorola, Xiaomi, and Huawei. They managed to successfully deploy SurfingAttack on 15 of the phones.

They managed to activate the voice assistants, commanding them to unlock devices, take repeated selfies, make fraudulent calls and read aloud a user’s text messages, including SMS verification codes.

The researchers turned the device’s volume down and recorded its responses with a concealed microphone, so the exchange would not be heard by a nearby user in an office setting.

Naked Security explains that the commands are simply sound waves. While other research has shown hacks carried out using ultrasonic waves humans can’t hear, SurfingAttack can send ultrasonic commands through a solid glass or wood table on which the smartphone sits, using a circular piezoelectric disc attached to the underside of the table.
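The article does not describe the signal processing involved, but inaudible-command attacks of this kind are generally understood to work by amplitude-modulating a recorded voice command onto an ultrasonic carrier, which a phone microphone’s nonlinearity demodulates back into the audible range. The short Python sketch below illustrates only that modulation step; the file name, carrier frequency, modulation depth, and sample rate are assumptions for illustration and are not taken from the researchers’ work.

```python
# Illustrative sketch only: amplitude-modulate a recorded voice command onto
# an ultrasonic carrier. Assumes a mono WAV sampled at 96 kHz or higher so
# the 28 kHz carrier stays below the Nyquist frequency.
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 28_000   # assumed ultrasonic carrier, above human hearing
MOD_DEPTH = 1.0       # full amplitude modulation

# Load a recorded command, e.g. "OK Google, read my messages" (hypothetical file)
rate, command = wavfile.read("ok_google_command.wav")
command = command.astype(np.float64)
if command.ndim > 1:
    command = command[:, 0]            # keep a single channel
command /= np.max(np.abs(command))     # normalise to [-1, 1]

# Mix the baseband voice signal onto the ultrasonic carrier.
t = np.arange(len(command)) / rate
carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
ultrasonic = (1 + MOD_DEPTH * command) * carrier

# Save the modulated waveform; driving a piezoelectric transducer with it is
# the hardware side of the attack and is outside the scope of this sketch.
wavfile.write("ultrasonic_payload.wav", rate,
              (ultrasonic / np.max(np.abs(ultrasonic)) * 32767).astype(np.int16))
```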

In a video demonstrating fraudulent calling, SurfingAttack unlocks the victim’s phone and places a call. After the call connects and Sam answers, a synthetic voice asks him for an access code, in this case the code to a lab, which the fraudster can then capture.
