US watchdog opens probe into Tesla's Autopilot driver assist system after spate of crashes

Lights, cones, illuminated arrows all involved, say investigators

A US government agency has formally opened a probe into Tesla's so-called Autopilot system following a spate of well-publicised crashes over the past few years.

The investigation covers over three-quarters of a million vehicles, which has got to be a decent chunk of the US inventory shifted by Tesla since the start of the 2014 model year. In the past three years alone, Tesla is estimated to have sold a combined 430,592 units of the Model X, Model S, and Model 3 in the United States.

The National Highway Traffic Safety Administration's Office of Defects Investigation (ODI) said it was looking into the "level 2" Advanced Driver Assistance System (ADAS), which it said consists of lane-keeping and cruise control functions.

"Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones," the agency said in a statement [PDF] today. "The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes."

Of the 11 crashes highlighted by the ODI, four took place this year. Its evaluation of the Autopilot system will focus on technology installed in Tesla Models S, Y, X and 3 between 2014 and the present day. The ODI potentially has the power to order a mass recall [PDF] of the affected cars if it believes they are unsafe. The manufacturer then needs to offer consumers a remedy.

Many Tesla crashes have involved the Autopilot driver assistance function, which is more super-cruise-control than a fully autonomous driving capability. In 2019, a smash in Florida occurred just 10 seconds after the driver enabled Autopilot; the car's systems failed to detect a white lorry crossing a junction in front of it.

Also that year, a Model 3 on Autopilot crashed into a police car that had pulled up behind a broken-down vehicle; the driver said he was "checking on his dog" at the time. Autopilot is supposed to function only while the driver has their hands on the steering wheel.

The tech was criticised by the US National Transportation Safety Board in a 2020 report, which found that "system limitations" in Autopilot, together with the victim's excessive trust in the software, were to blame for a fatal crash that killed an Apple engineer. Tesla refused to describe to investigators how the system operated.

The California Department of Motor Vehicles, the state's licensing agency, said in May that Tesla chief exec Elon Musk overstated the Autopilot system's capabilities. In April, tests by a US-based consumer rights organisation appeared to show that Autopilot could be kept engaged by fastening the driver-side seatbelt and hanging a weight off the steering wheel, bypassing safety features intended to ensure the human at the wheel is paying at least some attention to the road ahead.

The SAE automotive automation levels system measures the degree of automation a self-driving system provides. It defines six levels, ranging from 0 (no automation) to 5 (full automation requiring no human monitoring or intervention); Autopilot sits at level 2 (partial automation). ®
