Tesla Full Self-Driving 'fails' to notice child-sized objects in testing

Campaign group's study uses tiny sample size, test done without hands on the wheel

Updated The latest version of Tesla's Full Self-Driving (FSD) beta has a bit of a kink: it doesn't notice child-sized objects in its path, a campaign group has claimed.

In tests performed by The Dawn Project using a Tesla Model 3 equipped with FSD version 10.12.2 (the latest, released June 1), the vehicle was given 120 yards (110 metres) of straight track between two rows of cones with a child-sized mannequin at the end.

The group said the "test driver's hands were never on the wheel." Crucially, Tesla says even FSD is not a fully autonomous system: it's a super-cruise-control program with various features, such as auto lane changing and automated steering. You're supposed to keep your hands on the wheel and be able to take over at any time.

Traveling at approximately 25mph (about 40kph), the Tesla hit the dummy each time during the group's experiments.

Of the results, the Dawn Project said 100 yards of distance is more than enough for a human driver to notice a child, stating: "Tesla's Full Self-Driving software fails this simple and safety critical test repeatedly, with potentially deadly results."

"Elon Musk says Tesla's Full Self-Driving software is 'amazing.' It's not… This is the worst commercial software I've ever seen," said project founder Dan O'Dowd in a video he tweeted out along with the results.

O'Dowd, who also founded Green Hills Software in 1982 and advocates for software safety, has been an opponent of Tesla for some time, even launching a bid for a US Senate seat in California that centered on policing Tesla as a way to talk about broader issues of reliable engineering. O'Dowd's Senate bid ended in June when he lost the Democratic Party primary.

The Dawn Project's stated goal is "making computers safe for humanity." Tesla FSD is the project's first campaign. 

Tiny sample size

It's worth noting that The Dawn Project's tests of FSD 10.12.2, which took place on June 21 in Rosamond, California, seemingly only consisted of three runs. That's a very small sample size, though considering other Tesla tests and statistics, it's not entirely unexpected.

Malfunctions in Autopilot – Tesla's suite of software that includes regular Autopilot as well as FSD – have been cited as alleged factors in several fatal accidents involving both drivers and pedestrians over the years. Last year Tesla rolled back FSD software releases after bugs were discovered that caused trouble with left turns, something Tesla is still working on.

In early June, the US National Highway Traffic Safety Administration upgraded a probe of Autopilot to determine whether the technology "and associated Tesla systems may exacerbate human factors or behavioral safety risks." Ie, does Autopilot encourage people to drive badly? That investigation is ongoing.

A week after announcing its probe, the NHTSA said Tesla Autopilot operating at level 2 autonomy was involved in 270 of the 394 driver-assist accidents – around 70 percent – it cataloged as part of an investigation into the safety of driver-assist technology.

Most recently, the California Department of Motor Vehicles filed complaints against Tesla alleging the biz misrepresented claims the vehicles can drive autonomously. If Tesla doesn't respond to the DMV's claims by the end of this week, the case will be settled by default and could lead to the automaker losing its license to sell cars in California.

The Dawn Project said that the NHTSA has acted quickly to issue recalls on Tesla features, pointing to Tesla's NHTSA-spurred recalls of FSD code that allowed Teslas to roll past stop signs and the disabling of Tesla Boombox.

The Dawn Project says its research "is far more serious and urgent." ®

Updated to add

After our initial report, concerns were raised in various places online that the tests performed by The Dawn Project weren't legitimate. We've reached out to the group to address the criticisms against it, and we have yet to hear back. One of the primary issues is whether FSD or the Autopilot suite was actually engaged during the tests. The Dawn Project has released raw footage [MP4] from its experiments for you to see for yourself.

Bear in mind, previous studies have shown Autopilot may automatically disengage just before it detects there will be a collision.

According to the owner manual for the Tesla Model 3, the vehicle used in the tests, when Autopilot functions such as Autosteer and Navigate on Autopilot are engaged, the Autopilot icon in the upper-left of the Tesla's main display turns blue, and a single blue line is shown in the path of the vehicle, both of which are visible in the raw footage.

What the video can't show, and what The Dawn Project didn't mention in its report, is whether Autopilot's automatic emergency braking was enabled on the vehicle. The Model 3 manual indicates it can be disabled. We've also asked The Dawn Project to clarify this.

According to the signed affidavit from the test driver The Dawn Project used, "during three (3) of the tests, the [vehicle] was put into full self-driving mode and it hit the mannequin three (3) times." We've asked The Dawn Project for clarification on the number of runs that were performed, which features were enabled, and the results for each run.

In short: as we noted originally, either the sample size was very small, or more than three test runs occurred and only three were documented, which should be taken into account; and the driver's hands were off the wheel, which is not how the technology is supposed to be used. It is also now unclear whether automated braking was enabled, or whether whatever Autopilot features were enabled should have stopped the car anyway.

"FSD was engaged during the tests as the affidavit confirms and as the raw footage from inside the car also confirms," the researchers say. "The blue line and blue FSD symbol are visible on the Tesla screen. Additionally the raw footage of the testing has been available on our website."

What is clear is that it doesn't help that Tesla's marketing separates regular Autopilot from FSD yet puts both under the umbrella brand of just Autopilot, while the manual mixes the individual features together. So when people talk about FSD and Autopilot, you're never quite sure which functionality is being relied upon or whether it is active.

Take all of this, Tesla's point of view and the Dawn Project's report, with a suitable pinch of salt.