Tesla, Musk likely aware of Autopilot deficiencies behind Florida fatality, says judge
Decision means family of man killed in 2019 cross-traffic collision can take carmaker to trial
There is "reasonable evidence" that Tesla and Elon Musk were aware of deficiencies in Autopilot that caused the 2019 death of a Model 3 owner, a Florida judge has ruled, opening the way for yet another liability trial.
Judge Reid Scott of the Circuit Court for Palm Beach County ruled [PDF] last week that evidence presented in the lead-up to a civil trial over Tesla's liability for the death of Jeremy Banner suggests Tesla and Musk knew Autopilot had a poor ability to detect cross traffic. Despite this alleged knowledge, nothing was done to fix the system after the 2016 death of Joshua Brown, which preceded Banner's 2019 crash.
Autopilot was engaged in both Banner's and Brown's accidents, each of which occurred when both the driver and the car failed to notice cross traffic. Brown's and Banner's vehicles passed under semi-truck trailers traveling across their paths, shearing off the tops of the cars.
The Register previously covered the Banner case after it emerged over the summer that two Tesla Autopilot engineers deposed in the case had testified that Autopilot was released without the ability to recognize cross traffic. They told the court Autopilot was only designed for use on highways with center dividers because "it was technically a 'very hard thing' for the hardware and software to account for cross traffic," according to one engineer, who also said that Autopilot could be engaged and operated without a center divider present.
The engineers' testimony was a key component of the Banner family's case, and formed part of the family's motion to add a claim for punitive damages to the original complaint it filed against Tesla. According to the judge, the engineers' testimony and additional evidence are enough that a jury could find Tesla liable for both intentional misconduct and gross negligence.
"It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers were acutely aware of the problem with 'Autopilot' failing to detect cross traffic," Judge Scott said in his ruling.
Furthermore, "a reasonable person could find that [Tesla's] conduct was so reckless and wanting of care that it constituted a conscious disregard or indifference to the life, safety or rights of person [sic] exposed to such conduct," Scott opined.
Tesla has managed to avoid liability in two prior cases this year involving accidents that plaintiffs attributed to Autopilot failures, but the Banner case is decidedly different.
In April, a California jury sided with Tesla over an accident that injured Los Angeles resident Justine Hsu, whose Tesla in Autopilot mode swerved onto a curb, causing her airbag to deploy with enough force to reportedly break her jaw and knock out multiple teeth.
Hsu was awarded zero damages, with the jury finding Tesla had done everything right in its disclosure of Autopilot's functionality.
A second case decided last month involved the 2019 death of Tesla owner Micah Lee, whose Model 3 veered off an LA highway, hit a tree, and burst into flames. Lee was killed, and his two passengers critically injured. Tesla argued that Lee had consumed alcohol prior to the crash and it wasn't clear whether Autopilot was actually engaged.
Again, Tesla was found not liable for Lee's death.
Whether Tesla will be found liable in Banner's case will largely hinge on the testimony of those Autopilot engineers and Tesla's alleged prior knowledge of the flaw documented in Brown's death.
A date has yet to be set for the Banner trial, which is Palm Beach County Circuit Court Case No. 50-2019-CA-009962.
We have asked Tesla to comment. ®