Tesla has been hit with a lawsuit in Texas, filed by five local police officers, who fiercely criticised the firm’s Autopilot driver assistance program.
The Register reported that both Tesla and a local restaurant have been sued, after an alleged drink-driver smashed his Model X into the back of two parked police cruisers.
The lawsuit pushes Tesla’s Autopilot driving assistance system back into the spotlight, at a time when the US National Highway Traffic Safety Administration (NHTSA) is investigating multiple fatal crashes involving Tesla vehicles.
The Texas lawsuit, filed by five police officers involved in the incident, points out that Tesla “now sells more electric automobiles worldwide than any other company.”
“Yet certain features offered on Tesla automobiles, enthusiastically promoted by Tesla, make them extremely dangerous in certain circumstances,” the complaint reads.
Indeed, according to the Register, the complaint accuses the electric car maker of “defects in Tesla’s safety features,” the functionality of which has been “vastly and irresponsibly overstated” to “pump Tesla’s share price and sell more cars.”
The incident in February this year reportedly involved an unnamed driver crashing his Tesla Model X into the back of two parked police cruisers at 70mph (113kph), after they had stopped to investigate a fourth vehicle for suspected narcotics offences.
Thankfully there were no fatalities as a result of the crash, but the lawsuit alleges the police officers were “badly injured” and seeks compensation for the “severe injuries and permanent disabilities they suffered as a result of the crash”.
The crash was apparently so severe it pushed the parked vehicles into “six people and a German Shepherd.”
There is no word on the nature of the injuries to the police dog or police officers, but Canine Officer Kodiak reportedly “had to visit the vet” while the five officers and a civilian were taken to hospital.
The parked police cruisers “were declared a total loss,” the suit claims.
The lawsuit is seeking damages of $20 million.
“Even though Autopilot was enabled at the time and the police cars had flashing lights, the Tesla failed to engage the Autopilot safety features to avoid the accident,” the lawsuit reportedly adds. “The vehicle did not apply its ‘Automatic Emergency Braking’ to slow down to avoid or mitigate the accident.”
Tesla is being sued for releasing what the lawsuit alleges is an overhyped and malfunctioning safety system with a glaring blind spot for emergency vehicles with their flashing lights activated.
A local restaurant is also being sued over allegations the Tesla driver had “consumed alcohol to the point where he was obviously intoxicated, and he presented a clear danger to himself and others” yet “Pappasito’s Cantina continued to serve alcohol to him.”
The driver of the Tesla is, according to the Register, not named in the lawsuit, likely because he would be unable to contribute in any meaningful way to the $20m in combined damages sought by the plaintiffs.
“Tesla’s claims [about Autopilot and Automatic Emergency Braking] have been proven to be vastly and irresponsibly overstated, if not outright untrue,” the plaintiffs alleged in their lawsuit.
“Tesla is engaging in systematic fraud to pump Tesla’s share price and sell more cars, while hiding behind disclosures that tell the drivers that the system can’t be relied upon,” the lawsuit alleges. “Tesla knows that Tesla drivers listen to these claims and believe their vehicles are equipped to drive themselves, resulting in potentially severe injuries or death.”
While the lawsuit targets Tesla, there is little doubt that the Autopilot system has been abused and misused by drivers on a number of occasions.
In December 2019, a driver was charged after he placed his Tesla Model 3 on Autopilot so he could check on his dog in the back seat.
The Model 3, while in its driver assistance mode, failed to avoid crashing into a stationary Connecticut State Police cruiser, which had its blue flashing lights on as it attended to a broken-down car.
And in September 2020, a Tesla driver in Canada was charged when police found the driver and his passenger asleep in fully reclined seats, while the Tesla drove along a highway in Autopilot mode at speeds of more than 140kph (87mph).