Lady Gives Birth Whilst Tesla Used Autopilot


Tesla Autopilot is in the headlines again, after a husband engaged it to drive to hospital while his wife gave birth in the passenger seat

Tesla’s Autopilot is the brand’s driver-assistance system, and an unusual case of its use has been reported in the American media.

For years bad drivers have misused Tesla’s Autopilot system in unbelievably stupid ways, but now at least there is a rare bit of positive news associated with it.

According to the Philadelphia Inquirer, in September Yiran Sherry, 33, and her husband Keating Sherry, 34, were taking their three-year-old son Rafa to pre-school.

Image credit: Tesla

Tesla baby

According to the report, Yiran Sherry was heavily pregnant when her waters broke, and her husband placed her in the front passenger seat.

According to the Philadelphia Inquirer, problems began as her contractions increased rapidly while the couple remained stuck in traffic.

Keating Sherry reportedly switched on Autopilot after setting the navigation system to the hospital, 20 minutes away.

Keating Sherry reportedly kept one hand on the car’s steering wheel (to stop Autopilot disengaging) as he attended to his wife.

“She was squeezing my hand to the point where I thought she was going to shatter it,” Keating Sherry told the Inquirer. “I was [saying] ‘Yiran, OK, focus on your breathing.’ That was advice to myself, as well. My adrenaline was pumping.”

Yiran Sherry reportedly gave birth to her daughter just as they arrived at the hospital.

Nurses cut the baby’s umbilical cord over the car’s front seat.

Hospital staff reportedly named the baby girl the “Tesla baby”.

Autopilot misuse

This case does raise questions about drivers misusing Tesla’s Autopilot system.

Autopilot is a driver-assistance system, and it is not designed to drive the vehicle without the driver remaining in control.

Previous examples of drivers misusing Autopilot in stupid ways include using it to drive home while drunk, only to crash into a stationary police car; using it to allegedly show off to friends, killing them instead; using it to drive on a motorway while both driver and passenger slept; and using it so the driver could check on his dog, only to crash into a stationary police car.

Tesla’s Autopilot is thus being investigated by a US safety watchdog over a number of high-profile accidents involving emergency service vehicles while Autopilot was engaged.

Autopilot differs from Tesla’s more advanced Full Self-Driving (FSD) package, which sells for $10,000 or $199 per month in the United States to a highly select number of people.

Indeed, in order to qualify to join the FSD beta program, drivers must have Driver Safety Scores of 98 and up. Previously, FSD was limited to drivers with perfect 100 scores.

FSD is currently not fully autonomous driving (indeed it is only Level 2), and it is not approved by US officials. It still requires a driver behind the wheel paying attention, keeping their hands on the wheel, and being ready to take over.

Author: Tom Jowitt