US Regulator Opens Official Investigation Of Tesla Autopilot

Tesla is facing an official investigation of its Autopilot driver assistance system, after concerns were raised following a number of high-profile accidents.

In June this year, the National Highway Traffic Safety Administration (NHTSA) ordered all car makers equipping their vehicles with automated driving systems to begin reporting crashes, so the US regulator could “collect information necessary for the agency to play its role in keeping Americans safe on the roadways.”

Prior to that, in November 2020, the NHTSA began a public consultation on ways to improve the safety of ‘self-driving’ cars.

Tesla’s Model 3. Image credit: Tesla

Tesla Autopilot

During the last five years, car makers (including Tesla) have been at the centre of a number of incidents surrounding the use of automated driving systems, including multiple accidents and indeed fatalities.

This week, for example, the Daily Telegraph reported that six children and an adult were injured in a crash involving a Tesla car in the car park of a public school in Sussex.

There was thought to be one person in the Tesla Model 3 car, but it is not known if there was a driver behind the wheel at the time of the collision. It is also not known if the Model 3’s automated driving system was involved.

Are we ready for driverless transport?

Now documents filed by the NHTSA this week show the US car safety regulator is investigating 11 crashes of Tesla vehicles.

“Since January 2018, the Office of Defects Investigation (ODI) has identified eleven crashes in which Tesla models of various configurations have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes,” said the NHTSA report.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” it added. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”

Previous crashes

In December 2019, for example, a driver was charged after he placed his Tesla Model 3 on Autopilot so he could check on his dog in the back seat.

Unfortunately, the Model 3 (whilst in its autonomous driving mode) failed to avoid crashing into a stationary Connecticut State Police car, which had its blue lights flashing as it attended to a broken-down car.

Copyright Connecticut State Police

In May 2019, a US National Transportation Safety Board (NTSB) investigation into a fatal Tesla crash that March involving Autopilot found that the self-driving technology had been engaged for nearly 10 seconds before the crash.

The driver had apparently removed his hands from the wheel about eight seconds before the crash. The roof of the Tesla Model X was sheared off, and its 50-year-old driver was killed, when the vehicle drove under the trailer of a semi truck that was crossing its path in March 2019.

That March incident has similarities to a May 2016 crash in which a Model S also drove under the trailer of a semi truck crossing its path.

The investigation into that earlier crash found that Autopilot had failed to detect the white trailer against a bright sky.

Careless drivers?

But it is fair to say that, on the surface at least, Tesla’s Autopilot system appears to have been abused and misused by drivers on a number of occasions.

In September 2020, for example, a Tesla driver in Canada was charged when police found him and his passenger sleeping in fully reclined seats, whilst the Tesla drove along a highway in autonomous mode at speeds of more than 140kph (86mph).

Then in May this year the NTSB issued its preliminary findings on a Tesla car crash in Texas that killed two men.

And while its investigation is still continuing, and there is no conclusion yet about what caused the crash, the NTSB seems to back Elon Musk’s insistence that the Tesla Autopilot system was not engaged at the time of the accident. That accident happened in April 2021, when a 2019 Tesla Model S crashed into a tree north of Houston and burst into flames.

What made this crash notable was that no one was apparently behind the wheel of the car, according to local police.

Indeed, local police said they are 100 percent certain no one was driving, and insist the two dead men may have been utilising Autopilot (Tesla’s semi-automated driving system) in an extremely unsafe (i.e. idiotic) manner.

The police said the body of one passenger was located in the front passenger seat, while the other was located in the back seat of the Tesla.

Elon Musk was quick to cast doubt on that law enforcement theory, when he said data recovered so far showed Autopilot was not enabled.

However, engineers at the influential US magazine Consumer Reports (CR) then demonstrated how easy it is to defeat the Autopilot driver monitoring system.

Consumer Reports said its engineers easily tricked a Tesla Model Y so that it could drive on Autopilot, without anyone in the driver’s seat – a scenario that would present extreme danger if it were repeated on public roads.

In late May Tesla confirmed a change to its Autopilot self-driving technology, with the electric car maker dropping the use of radar. Instead, Tesla will utilise a camera-focused Autopilot system for its Model 3 and Model Y vehicles in North America.

Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
