A Tesla Model 3 driver reportedly fell asleep with the car's misleadingly named "Autopilot" lane-keeping feature enabled – and promptly crashed into a pile of barrels.
"This accident was my fault. I fell asleep at the wheel. I wasn't sleepy prior to falling asleep or I would have done something about that. That's actually the scary part," posted YouTuber "Richard FS", having uploaded a dashcam video from what he claimed was his own car.
That video (you'll have to go to YouTube to watch it as he's disabled both comments and the embedding feature, and will probably delete it altogether once he realises what a colossal moron he's made himself look) shows a car merrily ploughing into just under a dozen traffic cones on an American motorway.
The moment a Tesla on 'Autopilot' collides with a bunch of motorway traffic cones in America
"I stand by my position that yes, I failed to drive the car... but so did the computer," insisted Richard FS in his video's description, adding: "Tesla Model 3 with FSD option. Automatic Emergency Braking totally failed me on the one time I needed it most."
Not only that, the first few seconds of his video also show the car being driven past a flashing "change lane now" illuminated arrow positioned just before the cones.
Circled in red, the flashing 'change lane now' arrow
He continued: "If it's so conservative on what constitutes a hazard then what about those 10 barrels? Yes, they're plastic and not picked up by radar but a small child isn't either, so it's relying on those three forward facing cameras. Why didn't it detect the barrels?"
Probably, Richard, because traffic cones are not small children playing in the road – and also because the Autopilot system is designed as a driver-assistance aid, despite its idiotic and misleading name. If you fall asleep, the car isn't going to safely drive itself.
Luckily nobody was injured and the only damage was to Richard's pride (and those barrels).
This is rather different from the last time a Tesla driver fell asleep with the driver-assistance suite running. In that December 2018 incident, quick-thinking police pulled in front of the car and gradually slowed to a stop, exploiting the software's programming to avoid rear-ending vehicles ahead. Thankfully Tesla hasn't yet figured out how to automatically overtake.
Another Tesla driver – an Apple software engineer – was not so lucky when his car accelerated into a crash barrier last year, killing him and triggering a multi-car pileup.
The Reg has asked Tesla for comment. ®