
Researchers blind autonomous cars by tricking LIDAR

As I was on the motorway, I saw a man who wasn't there. Then things went pear-shaped

If you've ever been dazzled by some idiot driving towards you on high-beam at night, you'd probably welcome a self-driving car – except that one of its key “eyes”, LIDAR, can also be blinded, or tricked into reacting to objects that aren't there.

LIDAR - Light Detection and Ranging - is an important self-driving vehicle technology: it measures distances to objects by firing a pulsed laser at them and timing the reflections.
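The ranging principle itself fits in a few lines. Here's a toy Python sketch of the time-of-flight arithmetic (our illustration, nothing resembling Velodyne's firmware):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a reflector, from the laser pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A return arriving 400 ns after the pulse fired is roughly 60 m away.
print(f"{tof_distance(400e-9):.1f} m")  # -> 60.0 m
```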

Hocheol Shin, Dohyun Kim, Yujin Kwon, and Yongdae Kim of the Korea Advanced Institute of Science and Technology have demonstrated two kinds of attacks against LIDAR: a spoofing attack, and a saturation attack. Their work is published at the International Association for Cryptologic Research's pre-print archive here.

While their work was confined to the lab, they write that the potential damage from a real-world attack is serious.

“As per the data from UK Department for Transport, 55m is the braking distance for a car driving at 60mph. Because the braking distance is the distance required solely for braking, even autonomous vehicles have no room for checking the authenticity of the observed dots, but need to immediately activate emergency braking or evasive manoeuvres. Such sudden actions are sufficient to endanger the surrounding vehicles.”
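For the record, that 55m figure implies braking at roughly 6.5 m/s². Here's the back-of-the-envelope arithmetic – the constant-deceleration assumption is ours, not the DfT's:

```python
MPH_TO_MS = 0.44704       # metres per second in one mph
v = 60 * MPH_TO_MS        # 60 mph is about 26.8 m/s
d = 55.0                  # DfT braking distance, metres
a = v ** 2 / (2 * d)      # implied constant deceleration, from d = v^2 / (2a)
print(f"{a:.2f} m/s^2")   # -> 6.54 m/s^2: hard braking, no margin for second-guessing
```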

The subject for their proof-of-concept attacks was the Velodyne VLP-16 sensor.

The saturation attack is very straightforward: “By illuminating the LIDAR with a strong light of the same wavelength as that the LIDAR uses, we can actually erase the existing objects in the sensed output of the LIDAR.”
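The underlying idea: a LIDAR only registers a return if the echo stands out against the background light at its operating wavelength, so flooding the sensor raises that floor until real echoes drown. A toy model of our own (the threshold ratio is invented for illustration):

```python
# Toy model of the saturation attack - our illustration, not the paper's code.
# Assumption: a return is detected only if the echo clearly exceeds the
# background light reaching the sensor at the LIDAR's wavelength.
DETECTION_RATIO = 2.0  # invented threshold: echo must be 2x the background

def detects_return(echo_power: float, background_power: float) -> bool:
    """True if the echo stands out enough to register as a point."""
    return echo_power > DETECTION_RATIO * background_power

print(detects_return(echo_power=1.0, background_power=0.1))   # True: object seen
print(detects_return(echo_power=1.0, background_power=10.0))  # False: object "erased"
```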

The spoofing attack was more complex: the four researchers not only fed the LIDAR an optical illusion, they made the illusory object appear closer than the device creating it.

To do this, the attackers exploited two characteristics of LIDAR, one of them intrinsic to the technology, the other specific to the implementation.

Simplified illustration of LIDAR operation. Image: IACR paper

Rather than capturing whole objects (as a camera does), LIDAR captures a point cloud sufficient to infer that an object is in its view (the car's computers then decide what action to take, if any). To spoof an object, the attackers need only make the sensor respond to points of light that resemble an object's point cloud.

Exploiting refraction to trick the LIDAR. Image: IACR paper

If the sensor only responded in a single direction (say, straight ahead), spoofing wouldn't be much of an attack, since you'd have to put your attack device directly in the path of the vehicle.

That's where the implementation comes in: the researchers noticed that the Velodyne LIDAR (and many similar devices) protects its sensors with curved glass. A laser fired at that glass at an angle can exploit refraction to change the “apparent” direction and distance at which its point cloud lies.
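The bending itself is plain Snell's law. A quick Python sketch – the refractive indices here are assumed for illustration, not the VLP-16's actual housing specs:

```python
import math

N_AIR, N_GLASS = 1.0, 1.5  # assumed refractive indices, for illustration only

def refracted_angle_deg(incidence_deg: float,
                        n1: float = N_AIR, n2: float = N_GLASS) -> float:
    """Transmitted-beam angle via Snell's law: n1*sin(t1) = n2*sin(t2)."""
    return math.degrees(math.asin(n1 * math.sin(math.radians(incidence_deg)) / n2))

# A pulse hitting the glass 40 degrees off the surface normal continues
# at about 25 degrees - a different "apparent" direction.
print(f"{refracted_angle_deg(40.0):.1f} deg")  # -> 25.4 deg
```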

“Fake dots in directions other than the direction of the attacker can be a severe threat to the victim, because the detected points have different significances according to their directions on roads”, they write.

The researchers demonstrated a second spoofing attack: they captured the laser pulse emitted by a LIDAR, added a bit of delay, and sent back a corresponding pulse using their own laser.
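A delayed replay shifts the apparent range by c × delay ÷ 2 – and because the LIDAR fires periodically, a delay close to the pulse period can wrap the fake echo onto the next pulse, making the phantom dot appear closer than the attacker, as claimed above. Here's a sketch of that timing; the pulse period is an assumed figure, not the VLP-16's published one:

```python
C = 299_792_458.0  # speed of light, m/s

def apparent_range(attacker_distance_m: float,
                   delay_s: float, period_s: float) -> float:
    """Range the LIDAR infers for a pulse replayed after delay_s,
    given a (hypothetical) pulse repetition period of period_s."""
    round_trip = 2 * attacker_distance_m / C + delay_s
    return C * (round_trip % period_s) / 2.0

# Attacker 10 m out, assumed pulse period of 1.332 us: a 1.30 us delay
# wraps onto the next pulse, so the fake dot appears about 5 m away -
# nearer than the attacker's own position.
print(f"{apparent_range(10.0, 1.30e-6, 1.332e-6):.1f} m")  # -> 5.2 m
```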

The paper also points out the difficulty of defending systems against these attacks: adding technology to authenticate the perceived dots, for example, could slow things down too much in an autonomous vehicle. ®
