Tesla Faces Engineer Claims In Autopilot Fatal Crash Case


Testimony from multiple Tesla engineers surfaces as 2019 Autopilot-linked fatal crash case in Florida heads toward trial

Multiple Tesla engineers have testified that, before a similar fatal accident in 2019, the company made no changes to its Autopilot driver-assistance feature to account for limitations that contributed to a fatal 2016 crash.

The testimony was excerpted in recent filings in a case brought by the family of the man who died in the 2019 crash, Jeremy Banner, a 50-year-old father of three, Bloomberg reported.

The Banner case is likely to go before a jury in October, in what would be the first trial over a fatal accident blamed on Autopilot.

In the 2016 incident, Florida resident Joshua Brown was killed when he drove into the side of a truck that was crossing the road. Banner, also a Florida resident, died in very similar circumstances.

Image credit: Tesla

‘Not designed to detect that’

Tesla engineer Chris Payne said in 2021 testimony that although the company knew “that there’s cross traffic or potential for cross traffic, the Autopilot at the time was not designed to detect that”.

Engineer Nicklas Gustafsson made similar remarks in a 2021 deposition.

Banner’s widow earlier this month revised her complaint to seek punitive damages, arguing that following Brown’s 2016 death Tesla should have modified Autopilot to simply switch off in dangerous circumstances.

“There is evidence in the record that the defendant Tesla engaged in intentional misconduct and/or gross negligence for selling a vehicle with an Autopilot system which Tesla knew to be defective and knew to have caused a prior fatal accident,” the revised complaint reads.

‘Same defect’

Banner family lawyer Trey Lytal told Bloomberg that Tesla had allowed the “same defect” to cause a second death three years after the first.

Lytal said the company “not only knew of this defect, but was warned by regulators for the US government that the system should not be used on roads with cross traffic”.

Tesla says Autopilot is intended for use on highways and limited-access roads, although the system is not disabled in other environments.

It says drivers are repeatedly informed that they must remain alert and ready to take over from Autopilot at a moment’s notice. The company has said the drivers involved in the accidents were not paying attention.

Ambiguous message

But critics say Tesla has lulled drivers into a false sense of security with public statements about Autopilot’s capabilities and with a product design that allows their attention to wander while the system controls the vehicle.

In April Tesla attracted scorn for briefly arguing in a separate case that statements about Autopilot by chief executive Elon Musk might be deepfakes, a tactic Judge Evette Pennypacker called “deeply troubling”.

Earlier this year the firm won its first jury trial over a non-fatal Autopilot-linked accident, in a case in which a woman said the feature had caused her Model S to veer suddenly into the centre median of a Los Angeles street.