Researchers trick Tesla into massively breaking the speed limit by sticking a 2-inch piece of electrical tape on a sign

You'd hope it would know 85mph speed limits aren't exactly routine

Vid A single piece of electrical tape stuck to a 35mph (56kph) road sign is enough to trick the autopilot software in Tesla's vehicles into speeding up to 85mph (136kph).

The vulnerability was reported by McAfee Labs, the security research arm of McAfee, on Wednesday. Steve Povolny, head of McAfee Advanced Threat Research, and Shivangee Trivedi, a data scientist working on the same team, discovered the attack when they probed the camera system aboard Tesla's Model X and Model S vehicles, both built in 2016.

Both cars use a camera built around the EyeQ3 chip from MobilEye, a computer-vision company based in Israel and owned by Intel, to survey the vehicle's surroundings. Images from that camera are fed to machine-learning algorithms that detect things like lane markings and road signs, so Tesla's autopilot software can automatically steer, change lanes, and match the posted speed limit, if the driver enables those features.

When the researchers placed a strip of black electrical tape, about two inches (5cm) long, on a road sign depicting a 35mph (56kph) speed limit, the MobilEye camera in a Model X misread the sign as 85mph (136kph), and the car began speeding up accordingly.

The two-inch-long piece of tape is placed on the number three. This style of attack is described as Type A in the results below.

In the demo below, the driver overrides the autopilot and begins braking when the car reaches about 50mph (80kph).

YouTube video

The bug only affects older Tesla vehicles that carry the MobilEye EyeQ3 camera system, described as Hardware Pack 1. It also only works if the car supports Traffic Aware Cruise Control (TACC) in Tesla's autopilot software. The TACC feature allows the car's speed to be manipulated by road signs detected by the camera mounted on its windshield.

"We have repeated the testing numerous times – though it is not quite 100 per cent reliable in getting the misclassification while the vehicle is in motion. Once misclassified, the TACC feature is 100 per cent reliable in setting the incorrect target speed," a McAfee spokesperson told The Register.

The researchers attempted to fool Tesla's cameras into misclassifying the 35mph road sign by placing electrical tape on it in different ways. They tested a Model X 125 times, with various sticker styles, over four days.

The two-inch-long tape on the number three is described as "Type A" in the results. Type A was tested 43 times, and the camera mistook the 35mph sign for an 85mph one on 25 of those runs – a success rate of about 58 per cent.

Pesky adversarial examples

The altered road sign is what's known in the AI biz as an adversarial example. To craft adversarial examples that consistently trick machine-learning algorithms, the researchers first had to experiment with an image classifier.

First, they attacked the image classifier with various adversarial examples to find the most effective ones. Since the researchers had full access to how that classifier works, this stage is described as a "white-box" attack. Next, they fine-tuned the attack and transferred it to Tesla's cameras without any detailed knowledge of how the cars' image-recognition algorithm works – a "black-box" attack, sketched in code below.

"What this means, in its most simple form, is attacks leveraging model hacking which are trained and executed against white box, also known as open source systems, will successfully transfer to black box, or fully closed and proprietary systems, so long as the features and properties of the attack are similar enough," they said.

Although it's alarming that a piece of black tape can fool a Tesla into automatically speeding up, the potential dangers are probably pretty limited. Even after the cameras have been tricked, the driver still has to engage the TACC feature by physically double-tapping the car's autopilot lever, the researchers told El Reg.

"The driver would probably realize the car was accelerating quickly and engage the brakes or disable TACC, and other features, such as collision avoidance and possibly following distance could mitigate the possibility of a crash.

"The research was performed to illustrate the issues that vendors and consumers need to be aware of and to facilitate the development of safer products."

MobilEye's EyeQ3 camera system is deployed in over 40 million vehicles, including cars from Cadillac, Nissan, Audi and Volvo.

The researchers only tested MobilEye's cameras and the autopilot software in older Tesla Model S and X vehicles. Newer vehicles do not contain MobilEye cameras; Tesla ended its partnership with the Israeli biz in 2016.

It's not the first time adversarial examples have tricked Tesla cars in real life. Last year, a group of researchers from Tencent showed that Tesla vehicles could be forced to swerve across lanes by placing stickers on the road.

"We contacted and disclosed the findings to both MobilEye and Tesla in late September of 2019 – both expressed interest, and satisfaction with the research, but did not indicate any plans to address the models deployed in the field," the McAfee Labs researchers told us. We have contacted Tesla for comment. ®
