Watch your mirrors: Tesla Cybertrucks have 'Full' 'Self Driving' now
As eggheads reckon Musk-mobiles need human interventions every 13 miles
Owners of Tesla's Cybertruck are reporting that a software update enabling the self-styled Full Self Driving (FSD) feature has become available for their giant rolling wedges of stainless steel.
A post on the Cybertruck Owners Club Forum Sunday indicated that some lucky Cybertruck owners received an over-the-air software update labeled 2024.32.20, which included an early access build of FSD version 12.5.5. Multiple users reported receiving the update, and videos of purported FSD cruises in the Cybertruck have since appeared.
Unlike all the Tesla models that preceded it, the Cybertruck didn't ship with FSD or any other type of Autopilot technology, though buyers of the $99,990 vehicle could still pay for the feature on the promise that Tesla would deliver it in due course.
With the weekend release to early access invitees, the Cybertruck now has access to the newest version of FSD. Most Teslas are still running version 12.5.4, which only recently saw a general release. We've asked Tesla when FSD will be generally available for the Cybertruck, but hadn't received a response from the Elon Musk biz at the time of writing.
Tesla FSD: Faulty and Seriously Deadly?
There's plenty to criticize about the Cybertruck, which has been found to rust, jam fingers, and suffer from a flaw that can leave its accelerator pedal stuck.
Automotive safety experts have been especially critical of the vehicle's shape and excessive weight, which they've suggested make it unsafe for pedestrians, cyclists, and other motorists.
- Tesla Cybertruck gets cyberstuck during off-roading expedition
- California upgrade company aims militarized 'Tactical' Cybertruck at police forces
- Tesla recalls over 1.6M electric cars in China for faulty hood lock
- Tesla asks customers to stop being wet blankets about chargers
A recent report about all of Tesla's FSD wares offers another reason to worry about self-driving Cybertrucks.
That report came from automotive research firm AMCI Testing, which last week published what it claims is the "most extensive real-world test of Tesla's FSD ever conducted by an independent third party," covering more than 1,000 miles of real-world driving.
While Tesla FSD's performance was "impressive for a uniquely camera-based system … our drivers had to intervene over 75 times during the evaluation; an average of once every 13 miles," AMCI found.
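For a sense of where that once-every-13-miles figure comes from, here's a quick back-of-the-envelope check that treats the report's round lower bounds ("more than 1,000 miles", "over 75" interventions) as exact values for the sake of the arithmetic – an assumption on our part, since AMCI quotes only those rounded numbers:

```python
# Rough check of AMCI's headline figure, using the report's round
# lower bounds as stand-ins for the exact values (an assumption).
total_miles = 1_000      # "more than 1,000 miles" of real-world driving
interventions = 75       # "over 75" driver interventions

miles_per_intervention = total_miles / interventions
print(f"~{miles_per_intervention:.1f} miles between interventions")  # ~13.3
```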
The researchers described FSD as surprisingly capable, especially in the first few minutes of a drive.
"The confidence (and often, competence) with which [Tesla FSD] undertakes complex driving tasks lulls users into believing that it is a thinking machine – with its decisions and performance based on a sophisticated assessment of risk (and the user's wellbeing)," AMCI said.
But errors are frequent, the firm warned, and when they occur they're often "sudden, dramatic and dangerous."
"In those circumstances, it is unlikely that a driver without their hands on the wheel will be able to intervene in time to prevent an accident – or possibly a fatality," AMCI found.
That’s not hyperbole: Tesla's self-driving and Autopilot technologies have been linked to numerous fatal accidents in recent years. Several cases have been litigated, with Tesla settling one and being found not responsible in another.
Keep your hands on the wheel, in other words. Tesla does require drivers to remain in control while in FSD mode.
- Tesla's Autopilot false advertising tussle with California DMV must go to trial
Tesla has had a tough year, with earnings plummeting in the second quarter and its share of the US EV market dipping below 50 percent for the first time in its history.
The automaker has promised to reveal an autonomous robotaxi at an investor day scheduled for October 10 – after previous postponements.
However, AMCI observed that its findings lead it to suspect Tesla's autonomous driving capabilities may not be up to the task of safely operating a fleet of self-driving taxis.
"Getting close to foolproof, yet falling short, creates an insidious and unsafe operator complacency issue as proven in the test results," argued AMCI Global CEO David Stokols. "Although [Tesla FSD] positively impresses in some circumstances, you simply cannot reliably rely on the accuracy or reasoning behind its responses." ®