Elon Musk’s Tesla admits Full Self-Driving (FSD) beta software may cause crashes, issues recall for 362,758 vehicles
Tesla last week warned that its experimental driver-assistance software, marketed as Full Self-Driving (FSD) beta, may cause crashes.
To this end, the firm is voluntarily recalling 362,758 vehicles equipped with FSD, in a recall notice posted on the website of the National Highway Traffic Safety Administration last week.
Elon Musk and Tesla are currently seeking regulatory approval for the company’s advanced driver-assistance software, but last December Musk admitted it would not yet satisfy regulatory authorities.
Last November the driver of a 2021 Tesla Model S that allegedly triggered an eight-vehicle crash on San Francisco’s Bay Bridge told police the car had been in Full Self-Driving (FSD) mode, which had malfunctioned.
The Tesla driver said his vehicle’s “full self-driving” software braked unexpectedly and triggered the eight-car pileup.
The California Highway Patrol had reviewed videos that showed the Tesla vehicle changing lanes and slowing to a stop.
The police report reportedly stated that the Tesla Model S was travelling at about 55 mph and had moved into the far left-hand lane, but then braked abruptly, slowing the car to about 20 mph.
That led to a chain reaction that ultimately caused eight vehicles to crash, all of which had been travelling at typical highway speeds.
Tesla recall, admission
Now according to a safety recall report on the website of the National Highway Traffic Safety Administration, Tesla admitted that the FSD Beta system may cause crashes by allowing the affected vehicles to:
“Act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”
The FSD Beta system may also have trouble responding appropriately “to changes in posted speed limits,” the notice added.
The recall of affected vehicles includes the following years and models: 2016-2023 Model S and Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles equipped with or pending installation of FSD beta.
It should be noted that the term ‘recall’ can be a bit misleading here, as the issue is being fixed with a software update delivered over the Internet.
It is no secret that Tesla is facing challenges with both its ‘Autopilot’ system, as well as its experimental ‘Full Self-Driving’ beta system.
In June 2022 the US federal vehicle safety regulator (NHTSA) said it was upgrading its investigation of Tesla’s Autopilot driving-assistance system – the step taken before the agency determines a recall.
Tesla vehicles have allegedly accounted for nearly 70 percent of reported crashes involving advanced driver-assistance systems since June 2021, according to recent federal figures, but officials warned against drawing any safety conclusions.
NHTSA has opened 38 special investigations into crashes involving Tesla vehicles that have resulted in 19 deaths, looking at whether the software was a factor.
Musk has aggressively hyped Tesla’s Autopilot and FSD for years now.
In late 2016 Musk reportedly promised Tesla fans a self-driving car capable of driving from Los Angeles to New York without “the need for a single touch” by the end of 2017.
Then in 2019, Musk raised billions of dollars for Tesla by promising investors the company would have 1 million “robotaxi ready” cars on the road by the end of 2020.
In July 2020, Elon Musk said that Tesla was “very close” to achieving level 5 autonomous driving technology.
Level 5 is the holy grail of autonomous driving technology, as level 5 vehicles will not require human intervention, eliminating the need for a human driver altogether.
Indeed, it is said that level 5 cars won’t even need to have steering wheels or acceleration/braking pedals.
These cars will be free from geofencing, able to drive anywhere and do anything that a normal car with a human driver can do.
Tesla cars currently operate at level 2, which requires the driver to remain alert and ready to act, with hands on the wheel.
Tesla has not helped matters with the naming of its self-driving systems.