Musk tried to wriggle out of Autopilot grilling by claiming past boasts may be deepfakes

Tycoon hopes to swerve deposition in Tesla death crash lawsuit

Tesla CEO Elon Musk may face a deposition in a civil lawsuit over a death allegedly caused by Autopilot – after a judge rejected arguments made by the billionaire's lawyers.

The legal eagles attempted [PDF] to convince Judge Evette Pennypacker that lawyers representing the family of the late Tesla owner Walter Huang shouldn't be allowed to grill the Twitter tycoon, because he can't recall what he's believed to have said about Autopilot's capabilities, and because whatever the family says he said in the past – even recordings of him speaking – may have been a deepfake generated by AI software.

Yes, Musk's lawyers tried to argue the SpaceX supremo shouldn't be deposed and quizzed about things he can't remember or didn't record saying because those statements were potentially invented by someone else using machine-learning tools. Let's hope that line doesn't catch on: forget what I said, I may not even have actually said it – a computer did.

"The reality is that [Musk], like many public figures, is the subject of many 'deepfake' videos and audio recordings that purport to show in saying and doing things he never actually said or did," Musk's lawyers said in their opposition to the plaintiffs' motion to compel Elon's testimony.

Musk could be forced to testify to the truth of statements he is said to have made in the past, such as a 2016 claim cited by the plaintiffs that Autopilot "can drive autonomously with greater safety than a person. Right now." The tech mogul meanwhile argued that anything he said in the past that he doesn't have a copy of – such as that 2016 boast about Autopilot – could have been deepfaked, and so should be discarded.

"Mr Musk confirmed that he did not independently record the discussions or maintain a copy of the original recordings, did not take notes, and cannot specifically recall the details about the discussions or statements," Musk's lawyers continued.

"Therefore, even after consulting directly with Mr Musk, Tesla remains unable to admit or deny whether the recordings are authentic."

Judge Pennypacker wasn't convinced by the Tesla lawyers' arguments, reportedly saying their line of reasoning was "deeply troubling." 

"[Tesla lawyers'] position is that because Mr Musk is famous and might be more of a target for deep fakes, his public statements are immune," Judge Pennypacker wrote. She added that such arguments could allow Musk and other famous people to avoid taking responsibility for anything they say or do.

Whether Musk's lawyers considered that their arguments might make the judge more, rather than less, interested in getting Musk to talk under oath is unknown. We've reached out to Tesla's lawyers to ask that and other questions, and haven't heard back.

Musk may yet be able to avoid giving testimony if further legal arguments are more convincing.

Huang, a 38-year-old Apple engineer, was killed in 2018 when his Tesla Model X veered off a freeway exit ramp in California, accelerated, and slammed into a barricade at over 70 miles per hour. The vehicle was in Autopilot mode leading up to the crash and showed no signs of responding to the barricade. 

Tesla's lawyers have tried to shift blame for the crash onto Huang, pointing out that he was playing a video game at the time and claiming he ignored multiple warnings from the Autopilot system to put his hands back on the wheel.

You're supposed to keep your hands on the wheel when using Autopilot – an AI-supported super-cruise-control rather than a proper full self-driving system – which Huang seemingly did not do. There is concern that Tesla owners may have been given the false impression that Autopilot can and should be expected to function safely all by itself without human attention.

A hearing today will determine whether Musk will actually have to give testimony; Pennypacker's ruling ordering the deposition is tentative pending Thursday's proceedings. As Reuters pointed out, judges in California often issue such tentative rulings, which are almost always finalized with few changes after a hearing.

We already know some Autopilot boasts are fake

Last June, Tesla's director of Autopilot software, Ashok Elluswamy, was deposed in the Huang case, and his testimony pointed to at least one major example of Tesla misrepresenting Autopilot's safety: a four-minute video released in 2016.

The video, which claims to show a Tesla navigating city streets and parking without human intervention, was staged, Elluswamy said, to demonstrate "what was possible," not "what was then the state of Autopilot."

Musk made no mention of the video's staged nature when he shared it in a 2016 tweet, and Tesla's web page for the clip describes it as a display of "full self-driving hardware on all Teslas."

Elluswamy testified he was in the back seat during recording of the video, that it took multiple takes to get everything right, and that the route the vehicle traveled was planned and 3D mapped in advance. 

Regardless of whether Musk is deposed, the case is scheduled to go to trial on July 31. 

Tesla recently won a separate Autopilot-related lawsuit in California, in which a jury found the automaker was not at fault for injuries plaintiff Justine Hsu sustained when her Tesla crashed while in Autopilot mode.

Hsu's lawyers also sued Tesla for misrepresenting Autopilot's capabilities, but the jury decided Hsu had ignored Autopilot's warnings to put her hands back on the wheel. The result of Hsu's trial could come directly into play as Huang's case heads toward the courtroom.

Tesla is also facing investigations from America's National Highway Traffic Safety Administration and the Department of Justice regarding Autopilot safety. It's already been forced to recall Autopilot code over safety issues, while the DoJ is reportedly in the midst of a criminal investigation into the hype surrounding the Tesla feature. ®
