Tesla on the wrong tracks with Fail Self Driving, Senators worry
Full Self-Driving mode could be on-track to cause serious accidents at train crossings
A pair of US senators is asking the federal traffic safety agency to look into Tesla's self-driving software in response to complaints that it fails to stop for trains at railroad crossings.
The electric auto firm has a checkered history with its Full Self-Driving (FSD) software, which is designed to handle driving tasks with an attentive driver behind the wheel today, though Tesla markets it as a step toward eventual full autonomy. The system has been involved in a number of mishaps.
Now, Senators Edward J. Markey (D-MA) and Richard Blumenthal (D-CT) have penned a letter [PDF] to the National Highway Traffic Safety Administration (NHTSA), asking it for a formal investigation into Tesla's system because of reported failures to safely detect and respond to railroad crossings.
"Despite past investigations by the NHTSA into Tesla's system, FSD reportedly continues to pose an ongoing threat to safety on our public roads. Because collisions between trains and cars often cause significant fatalities and injuries, FSD's failure to safely navigate railroad crossings creates serious risk of a catastrophic crash," the pair explains.
The issue is that a number of Tesla drivers have reported incidents in which their vehicle using FSD failed to recognize warning signs or even active crossing gates, requiring immediate human intervention to avoid a potentially fatal collision.
But you don't have to take their word for it. The Dawn Project, which has made a mission of exposing Tesla's faults, posted a video filmed in Ventura County, California, of one of the vehicles with FSD engaged failing to stop at a crossing. It ignores the red flashing lights and alarm bells and tries to drive straight across the tracks, with a train approaching, forcing the driver to hit the brakes.
The Dawn Project is an advocacy campaign founded by software entrepreneur Dan O'Dowd, who commented: "Yet more evidence that Tesla's 'Full Self-Driving' software will try to kill you."
This follows an earlier NHTSA investigation launched last year, prompted by a series of accidents in low-visibility conditions, as well as incidents such as a fatal collision with a motorcyclist in Washington State while a Tesla was operating in Autopilot mode.
Incidents such as these demonstrate both the limitations of Tesla's technology and confusion around the branding of FSD, the senators write in their letter.
"Although Tesla warns FSD users to always supervise their vehicles when FSD is engaged, consumers would understandably expect a system named Full Self-Driving to recognize something as rudimentary as the flashing signals indicating an oncoming train at a rail crossing, without the driver having to intervene to avoid disaster," they state. In the senators' view, this makes the Full Self-Driving label both misleading and dangerous.
Tesla has since renamed the feature Full Self-Driving (Supervised), they note.
The pair wants NHTSA to force Tesla to adopt a name for its automated driving system that does not mislead drivers about its capabilities, in addition to conducting a formal investigation into FSD's behavior at rail crossings.
"As part of this investigation, the agency should consider clear and obvious actions to protect the public, including restricting Tesla's FSD to the road and weather conditions it was designed to operate in," the letter insists.
NHTSA's response to the reports concerning railroad crossings has so far been unsatisfactory, the senators state, with the agency doing nothing more than acknowledging it is aware of the incidents and stating it is in contact with the manufacturer.
We approached Tesla for comment and will update this article if we hear back. ®