Tesla Driver Blames ‘Malfunctioning’ Self-driving Software After Crash

Tesla's Model S. Image credit: Tesla

Driver claims his Tesla's Full Self-Driving (FSD) software malfunctioned, triggering a crash that caused multiple injuries

Tesla's self-driving software is once again in the spotlight, after a driver directly blamed his vehicle's Full Self-Driving (FSD) mode for a serious car crash.

According to CNN, the driver of a 2021 Tesla Model S that allegedly triggered an eight-vehicle crash on San Francisco's Bay Bridge last month told police the vehicle was in Full Self-Driving (FSD) mode, and that the software had malfunctioned.

According to the police report, which was made public on Wednesday, the Tesla driver said his vehicle's "Full Self-Driving" software braked unexpectedly and triggered the eight-car pileup.

Image credit: Tesla

FSD crash?

CNN reported that the California Highway Patrol had reviewed videos that showed the Tesla vehicle changing lanes and slowing to a stop.

The police report stated that the Tesla Model S was travelling at about 55 mph and had moved into the far left-hand lane, but then braked abruptly, slowing the car to about 20 mph.

That triggered a chain reaction that ultimately caused eight vehicles to crash, all of which had been travelling at typical highway speeds.

California Highway Patrol said in the 7 December report that it could not confirm if “full self-driving” was active at the time of the crash.

And a highway patrol spokesperson told CNN Business on Wednesday that the agency would not determine whether "full self-driving" was active, and that Tesla would have that information.

The crash occurred around lunchtime on Thanksgiving, disrupting traffic on Interstate 80 east of the Bay Bridge; two lanes were closed for about 90 minutes as many people travelled to holiday events.

Four ambulances were called to the scene, and two teenagers were reportedly taken to hospital for treatment for minor injuries.

The police report noted that if Tesla's FSD beta had malfunctioned, the driver should have taken manual control of the vehicle.

Official investigation

This crash will be noted by US safety officials, amid ongoing concerns about Tesla's self-driving technology and advanced driver-assistance systems, which Elon Musk has aggressively hyped.

Tesla's driver-assist technologies, Autopilot and "Full Self-Driving", are already being investigated by the National Highway Traffic Safety Administration (NHTSA) following reports of unexpected braking that occurs "without warning, at random, and often repeatedly in a single drive."

The NHTSA launched a formal investigation into Tesla's self-driving systems in August 2021, after a series of high-profile fatal crashes.

Then in June this year, the NHTSA announced it was upgrading its preliminary investigation to an "engineering analysis", the step taken before the agency decides whether to demand a recall.

In October it was reported that the US Department of Justice had begun a criminal investigation in 2021 into Tesla's claims that its electric vehicles (EVs) can drive themselves.

The investigation reportedly followed allegations that Tesla and Musk had, since 2016, deceptively advertised the technology as fully functional or "just around the corner", despite knowing that it did not work or did not exist, making the vehicles unsafe.

Soon after that, Elon Musk confirmed that the company's advanced driver-assistance software ("Full Self-Driving", or FSD) would not satisfy regulatory authorities, at least not this year.

In September Tesla was also hit by a lawsuit alleging the EV maker misled the public by falsely advertising its Autopilot and Full Self-Driving (FSD) features.

Tesla has said Autopilot enables vehicles to steer, accelerate and brake within their lanes, while Full Self-Driving, which costs an extra $15,000, lets vehicles obey traffic signals and change lanes.

To be fair, Tesla has always said that both technologies "require active driver supervision," with a "fully attentive" driver whose hands are on the wheel, and that they "do not make the vehicle autonomous."

Tesla crashes

All of this comes after a series of crashes, some fatal, involving Tesla EVs whose automated driver-assistance systems were in use at the time.

For example, in September 2021 five police officers in Texas sued Tesla after they and a police dog were 'badly injured' when an unnamed driver crashed his Tesla Model X into the back of two parked police cruisers at 70mph (112kph); the officers had stopped to investigate a fourth vehicle for suspected narcotics offences.

The driver was drunk and was using his Tesla to drive him home when it crashed into and wrecked the police cars, leaving the officers with "severe injuries and permanent disabilities."

In December 2019 a driver was charged after he placed his Tesla Model 3 on Autopilot so he could check on his dog in the back seat.

Unfortunately, the Model 3, while in Autopilot mode, failed to avoid a stationary Connecticut State Police car that had its blue lights flashing as it attended a broken-down vehicle.

Image credit: Connecticut State Police

In May 2019 a US National Transportation Safety Board (NTSB) investigation into a fatal Tesla crash involving Autopilot in March of that year found that the technology had been engaged for nearly 10 seconds before the crash.

The driver had apparently removed his hands from the wheel about eight seconds before the crash. The roof of the Tesla was sheared off, and its 50-year-old driver killed, when the vehicle drove under the trailer of a semi truck that was crossing its path.

That March incident bore similarities to a May 2016 crash in which a Model S also drove under the trailer of a semi truck crossing its path.

The investigation into that crash found that Autopilot had failed to detect the white trailer against a bright sky.

In January 2022, another Tesla accident resulted in the first-ever US case of a driver being charged with vehicular manslaughter over a crash involving a driver-assistance system, after his Model S went through an intersection with Autopilot engaged, striking a Honda Civic and killing two people.