Poltergeist attack could leave autonomous vehicles blind to obstacles – or haunt them with new ones

First 'AMpLe' proof of concept proves worryingly easy to pull off

Researchers at the Ubiquitous System Security Lab of Zhejiang University and the University of Michigan's Security and Privacy Research Group say they've found a way to blind autonomous vehicles to obstacles using simple audio signals.

"Autonomous vehicles increasingly exploit computer-vision based object detection systems to perceive environments and make critical driving decisions," they explained in the abstract to a newly released paper. "To increase the quality of images, image stabilisers with inertial sensors are added to alleviate image blurring caused by camera jitter.

"However, such a trend opens a new attack surface. This paper identifies a system-level vulnerability resulting from the combination of the emerging image stabiliser hardware susceptible to acoustic manipulation and the object detection algorithms subject to adversarial examples."

To try to prove their point, the team came up with Poltergeist: an attack against camera-based computer-vision systems, as found in autonomous vehicles, which uses audio to trigger the image stabilisation functions of the camera sensor and blur the image – tricking the machine learning system into ignoring obstacles in its way.

"The blur caused by unnecessary motion compensation can change the outline, the size, and even the colour of an existing object or an image region without any objects," the team found, "which may lead to hiding, altering an existing object, or creating a non-existing object." The team categorised these in turn as Hiding Attacks (HA), Creating Attacks (CA), and Altering Attacks (AA).

It's the first example of what the researchers have claimed as a new class of attack: AMpLe, a somewhat clunky shuffled backronym for "injecting physics into adversarial machine learning."

In simulation, Poltergeist showed a 100 per cent success rate for hiding, 87.9 per cent for creating, and 95.1 per cent for altering objects when trialled against the YOLO V3/V4/V5 and Fast R-CNN object detection networks, plus a commercial YOLO 3D implementation used in Baidu's Apollo robo-taxis.
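For a feel of how a hiding success rate like that might be measured, here's a hedged sketch using a public YOLOv5 model via torch.hub as a stand-in for the detectors the team tested. It is not the authors' evaluation harness, just the shape of the metric: count how often a confidently detected object vanishes once the frame is blurred.

```python
import torch

# Public YOLOv5 checkpoint as a stand-in detector (not the paper's harness).
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

def hiding_success(clean_frames, blurred_frames, conf_thresh=0.5):
    """Fraction of frames where an object seen in the clean frame is no
    longer detected after blur -- a crude proxy for a Hiding Attack (HA)."""
    hidden = total = 0
    for clean, blurred in zip(clean_frames, blurred_frames):
        before = model(clean).xyxy[0]          # columns: x1, y1, x2, y2, conf, cls
        after = model(blurred).xyxy[0]
        before = before[before[:, 4] > conf_thresh]
        if len(before) == 0:
            continue                           # nothing to hide in this frame
        after = after[after[:, 4] > conf_thresh]
        total += 1
        if len(after) < len(before):           # a detection disappeared
            hidden += 1
    return hidden / max(total, 1)
```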

To prove the concept outside the lab, the team attached a Samsung S20 smartphone to a moving vehicle and carried out an actual attack. While object creation and alteration proved considerably more difficult than the simulations had suggested, with success rates of 43.7 per cent and 43.1 per cent respectively, hiding objects was easy, with a worrying 98.3 per cent success rate, the researchers said.

"PG [Poltergeist] attacks are robust," the team found, "across various scenes, weathers, time periods of a day, and camera resolutions."

The team stopped short, however, of actively attacking a real-world autonomous vehicle. "While it's clear that there exist pathways to cause computer vision systems to fail with acoustic injection," the researchers concluded, "it's not clear what products today are at risk. Rather than focus on today's nascent autonomous vehicle technology, we model the limits in simulation to understand how to better prevent future yet unimagined autonomous vehicles from being susceptible to acoustic attacks on image stabilisation systems."

The concept doesn't stop at audio signals, either. "Future AMpLe attacks could leverage signal transmission via ultrasound, visible light, infrared, lasers, radio, magnetic fields, heat, fluid, etc. to manipulate sensor outputs and thus the subsequent machine learning processes (e.g., voice recognition, computer vision)," the researchers warned.

"AMpLe attacks could cause incorrect, automated decisions with life-critical consequences for closed loop feedback systems (e.g., medical devices, autonomous vehicles, factory floors, IoT [Internet of Things])."

More information is available on the project's GitHub repository, while a PDF of the paper can be downloaded under open-access terms from here. ®
