Watchdog 'enables Tesla Autopilot' with string, some weight, a seat belt ... and no actual human at the wheel

'The system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all'


Tesla vehicles can be tricked into driving in Autopilot mode when no one is sitting in the driver's seat or holding the steering wheel – as required by the software – according to Consumer Reports.

Engineers at the organisation, a non-profit focused on testing commercial products, said that buckling the safety belt and dangling a small weight from a Tesla Model Y's steering wheel is enough to fool it into believing a driver is present, and thus allow the super-cruise-control to activate and take over the car.

With the weight installed and belt buckled, the engineers said they were able to cruise around a private track using Autopilot without anyone sitting in the driver's seat.

Autopilot was turned on before one member of the team slowed the car to a stop and hopped over to the passenger seat. With Autopilot still enabled, the vehicle's speed was adjusted and it continued to drive even though nobody was behind the wheel – something that shouldn't normally be possible.

Driver-assistance systems from BMW, Ford, GM, Subaru, and others use a driver-facing camera to make sure someone is behind the wheel and paying attention.

Word of the demonstration comes after a deadly crash in Texas last weekend involving a 2019 Tesla Model S. Police officers investigating the incident believe neither of the two victims in the vehicle was driving the car at the time. The vehicle smashed into a tree and burst into flames.


Two US federal agencies, the National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB), have launched separate probes into the collision, according to Dallas News.

Tesla supremo Elon Musk, however, claimed data logs from the vehicle showed neither Autopilot nor the automaker's Full Self-Driving mode was engaged. He also doubted Autopilot could drive on the road where the accident occurred.

“Standard Autopilot would require lane lines to turn on, which this street did not have,” he tweeted last week.

Tesla cars have various safety features that should, in theory, keep drivers safe when a vehicle is operating in Autopilot mode. Among those measures is a requirement that the driver's hands be detected on the steering wheel every ten seconds; failure to comply disables the self-driving features.

“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” said Jake Fisher, senior director of auto testing, who conducted the experiment at Consumer Reports.

“Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road,” he added.

“Let me be clear: Anyone who uses Autopilot on the road without someone in the driver seat is putting themselves and others in imminent danger,” Fisher concluded.

The Register has asked Tesla, the NHTSA, and the NTSB for comment, and we'll let you know if they get back to us. ®
