Tesla's self-driving code may ignore stop signs, act unsafe. Patch coming ... soon

Musk takes issue again with 'recall' given this will be an over-the-air update

The US National Highway Traffic Safety Administration (NHTSA) has sent Tesla a letter acknowledging that the automaker will conduct a recall of its Full Self-Driving Beta (FSD Beta) software in up to 362,758 cars, on the grounds that the software is unsafe.

The issue [PDF] affects Model S, Model X, Model 3, and Model Y vehicles, some dating back to 2016.

The NHTSA asserts the software's bugs mean it "may allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.

"In addition, the system may respond insufficiently to changes in posted speed limits or not adequately account for the driver's adjustment of the vehicle's speed to exceed posted speed limits."

A recall notice [PDF] issued by the NHTSA explains the agency notified the automaker about potential concerns in late January. On February 7, Tesla and the NHTSA agreed the solution was a recall, although Tesla said it was acting out of an "abundance of caution" while "not concurring with the agency's analysis."

On February 14, Tesla identified 18 warranty claims that could be related to the software issue. The Elon Musk-led giant stated it was not aware of any injuries or deaths associated with FSD Beta.

Tesla's response to the regulator's concerns is an over-the-air software update due in mid-April, making this a very modern recall compared to the more conventional sort that requires a trip to a service center.

Musk has therefore taken issue with the terminology used to describe the situation, arguing on Twitter on Thursday that "recall" is the wrong word for an over-the-air software update.

Maybe he has a point. But surely the real issue here is that the software is dangerous?

Tesla has separately been under investigation over its Autopilot feature since October. Authorities want to know whether the company misled consumers about the safety and capability of FSD and Autopilot.

The automaker declared in its annual 10-K form, filed in late January, that it had handed over documents related to the probe to the Department of Justice.

The form also stated "no government agency in any ongoing investigation has concluded that any wrongdoing occurred."

Data released by the NHTSA in June showed Tesla vehicles were involved in over 70 percent of reported crashes among cars equipped with advanced driver assistance systems. ®
