Influential US magazine Consumer Reports (CR) has waded into the mystery surrounding the deaths of two men in Texas in a ‘driverless’ Tesla car accident.
A week ago, a 2019 Tesla Model S burst into flames after crashing into a tree north of Houston, with no one behind the wheel of the car.
Authorities believe the two dead men may have been using Autopilot (Tesla’s semi-automated driving system) in an extremely unsafe manner.
The police said the body of one passenger was located in the front passenger seat, while the other was located in the back seat of the Tesla.
This week, however, Tesla boss Elon Musk cast doubt on that theory, saying data recovered so far showed Autopilot was not enabled.
Authorities are seeking to obtain this data from Tesla.
Musk was responding to a tweet questioning the official account. The poster argued that the police version did not make sense, because Tesla’s Autopilot safety measures include a weighted driver’s seat to confirm someone is behind the wheel, and hands must be on the steering wheel every 10 seconds or the system disengages.
But now Consumer Reports has reported that its engineers have demonstrated how easy it is to defeat Autopilot’s driver monitoring system.
Consumer Reports said its engineers easily tricked a Tesla Model Y this week so that it could drive on Autopilot, without anyone in the driver’s seat – a scenario that would present extreme danger if it were repeated on public roads.
“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” said Jake Fisher, CR’s senior director of auto testing, who conducted the experiment. “Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road.”
Consumer Reports’ investigation comes as federal and local investigators continue to probe the cause of Saturday’s fatal crash in Texas.
The magazine said it had tried to reach Tesla to ask about the Texas crash but did not hear back.
CR explained it wanted to see whether it could prompt its own Tesla to drive down the road without anyone in the driver’s seat.
So Fisher and Kelly Funkhouser, CR’s program manager for vehicle interface testing, took a 2020 Tesla Model Y out on a test track.
Funkhouser sat in the rear seat, and Fisher sat in the driver seat on top of a buckled seat belt. This is because Autopilot will disengage if the driver’s seat belt is unbuckled while the vehicle is in motion.
Fisher then engaged Autopilot while the car was in motion on the track, then set the speed dial (on the right spoke of the steering wheel) to 0, which brought the car to a complete stop.
Fisher next placed a small, weighted chain on the steering wheel, to simulate the weight of a driver’s hand, and slid over into the front passenger seat without opening any of the vehicle’s doors, because that would disengage Autopilot.
Using the same steering wheel dial, which controls multiple functions in addition to Autopilot’s speed, Fisher reached over and was able to accelerate the vehicle from a full stop. It seems the seat weight sensor did not kick in.
He then stopped the vehicle by dialing the speed back down to zero.
“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” said Fisher. “It was a bit frightening when we realised how easy it was to defeat the safeguards, which we proved were clearly insufficient.”
CR warned that under no circumstance should anyone try this.
Tesla of course recommends that users be prepared to take over from Autopilot at all times (and remain in the driver’s seat), and that drivers do not remove their hands from the steering wheel whilst Autopilot is engaged.
But that has not stopped some reckless drivers from exploiting the Autopilot system in the past.
In September 2020, a Tesla driver in Canada was charged after police found him and his passenger sleeping in fully reclined seats, whilst the Tesla drove along a highway in autonomous mode at speeds of more than 140kph (86mph).
This particular driver got around Tesla’s safety system requiring the steering wheel to be toggled or adjusted every 20 seconds by attaching a weight to the steering wheel to trick the car’s systems – just like the CR engineers did in their experiment.
Fisher and Funkhouser pointed out that this trick would not be possible with driver-assist systems from other car makers that utilise a driver-facing camera to make sure the driver is alert, present, and paying attention to the road ahead.
CR therefore recommends some changes to Tesla’s system, saying the car maker could use the weight sensor in the vehicle’s driver’s seat to determine whether there is actually a human sitting behind the wheel in order for Autopilot to work.
These sensors are already used for seat belt warnings and airbags, among other things, so it wouldn’t be a major leap to program a vehicle to turn off features like cruise control if it senses that the driver’s seat is empty, Funkhouser said.
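The interlock CR describes can be sketched in a few lines. The following is a hypothetical illustration only, not Tesla’s actual software; the function name, sensor inputs, and weight threshold are all invented for the example.

```python
# Hypothetical sketch of the seat-occupancy interlock CR recommends.
# Sensor names and the threshold value are invented for illustration;
# this is not Tesla's actual implementation.

MIN_DRIVER_WEIGHT_KG = 25.0  # assumed cut-off: below this, treat the seat as empty


def autopilot_may_engage(seat_weight_kg: float,
                         seat_belt_buckled: bool,
                         hands_on_wheel: bool) -> bool:
    """Allow driver-assist features only when a driver is plausibly present."""
    driver_present = seat_weight_kg >= MIN_DRIVER_WEIGHT_KG
    return driver_present and seat_belt_buckled and hands_on_wheel


# A weighted chain on the wheel and a buckled belt cannot defeat the seat check:
print(autopilot_may_engage(0.0, True, True))   # empty seat -> False
print(autopilot_may_engage(70.0, True, True))  # driver seated -> True
```

Under this sketch, the trick CR used (buckled belt plus a weight on the wheel, with the driver’s seat empty) would fail the occupancy check, which is exactly the behaviour Funkhouser argues existing seat sensors could already support.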