NHTSA upgrades Tesla Autopilot probe, could lead to recall

Up for debate is whether Autopilot, or humans behaving badly, is the reason for Tesla's iffy safety record

An investigation into the safety of Tesla's so-called Autopilot has been upgraded from a preliminary peek to a formal engineering analysis, a step that could put the Musk-led motor company on the path to a recall of more than 800,000 vehicles.

The investigation, being conducted by the US National Highway Traffic Safety Administration (NHTSA), began last year following a series of incidents in which Teslas with Autopilot engaged collided with other vehicles on the road or with emergency vehicles parked at the roadside while responding to earlier accidents.

The NHTSA's investigation is limited to 2014-2022 Tesla Model Y, Model X, Model S, and Model 3 vehicles, of which it estimates 830,000 have shipped.

In the course of its preliminary investigation, the NHTSA said it found reasons "to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver's supervision."

Put another way, the agency is concerned that Autopilot encourages people to behave badly behind the wheel. Drivers are supposed to remain alert and able to take control at any moment, though with a brand name like Autopilot, and Tesla's boasts about its capabilities, owners may have been given the wrong idea about how much supervision the thing actually needs.

In all, 16 accidents were investigated as part of the probe, and 106 additional wrecks that followed a similar pattern but didn't involve emergency vehicles were also considered. "In approximately half of the … 106 crashes, indications existed that the driver was insufficiently responsive," the NHTSA said in its letter [PDF] reporting the upgrade.

Additionally, a quarter of the accidents involved Autopilot being used somewhere it wasn't designed to operate, such as on a surface street or in a low-visibility environment. Drivers also apparently had their hands on the wheel in 86 percent of the cases for which that data was available.

NHTSA's preliminary evaluations are largely reviews of complaints and manufacturer documents and records, and upon conclusion are either closed or elevated to engineering analyses. In the latter case, "an engineering analysis (EA) is undertaken if data from a preliminary evaluation indicate further examination of a potential safety defect is warranted. The results of an EA determine whether a safety recall should be initiated or the investigation should be closed," the NHTSA said. 

The watchdog didn't say whether Autopilot was specifically at fault, and the EA will likely try to determine whether driver inattentiveness or Autopilot's programming was actually the cause of the accidents. According to the preliminary study, forward collision warnings activated in the majority of the incidents examined, while automatic emergency braking intervened in only about half of them.

"On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact," the NHTSA said.

That doesn't give drivers who lean too heavily on Autopilot much of a chance to avoid a prang.

In 2020, a Tesla Model X plowed into two parked police vehicles whose officers were investigating another car, resulting in serious injuries to five human officers and a canine cop named Kodiak. The officers sued Tesla, claiming Autopilot is defective and Tesla's vehicles unsafe.

Another Tesla accident, earlier this year, resulted in the first-ever US case of a driver being charged with vehicular manslaughter over a crash involving a partially automated driving system: the driver's Model S went through an intersection with Autopilot engaged, striking a Honda Civic and killing two people.

Tesla, which disbanded its PR department in 2020, has not responded to a request for comment. ®
