Lady Gives Birth Whilst Tesla Used Autopilot

Tesla’s Autopilot is the brand’s driver-assistance system, and an unusual case of its use has been reported in the American media.

For years now bad drivers have used Tesla’s Autopilot system for unbelievably stupid reasons, but now at least there is a rare bit of positive news associated with it.

According to the Philadelphia Inquirer, in September Yiran Sherry, 33, and her husband Keating Sherry, 34, were taking their three-year-old son Rafa to pre-school.

Image credit: Tesla

Tesla baby

According to the report, Yiran Sherry was heavily pregnant when her waters broke, and her husband placed her in the front passenger seat.

According to the Philadelphia Inquirer, problems began as her contractions increased rapidly while the couple remained stuck in traffic.

Keating Sherry reportedly switched on Autopilot after setting the navigation system to the hospital, 20 minutes away.

The husband reportedly kept one hand on the car’s steering wheel (to stop Autopilot disengaging) as he attended to his wife.

“She was squeezing my hand to the point where I thought she was going to shatter it,” Keating Sherry told the Inquirer. “I was [saying] ‘Yiran, OK, focus on your breathing.’ That was advice to myself, as well. My adrenaline was pumping.”

Yiran Sherry reportedly gave birth to her daughter just as they arrived at the hospital.

Nurses cut the baby’s umbilical cord in the car’s front seat.

Hospital staff reportedly named the baby girl the “Tesla baby”.

Autopilot misuse

This case does raise questions about drivers misusing Tesla’s Autopilot system.

Autopilot is a driver-assistance system and is not designed to drive the vehicle without the driver remaining in control.

Previous examples of drivers misusing Autopilot in stupid ways include a drunk driver using it to get home, only to crash into a stationary police car; a driver allegedly using it to show off to friends, killing them instead; a driver and passenger both sleeping while the car travelled on a motorway; and a driver using it so he could check on his dog, before crashing into a stationary police car.

Tesla’s Autopilot is thus being investigated by a US safety watchdog, following a number of high-profile accidents involving emergency service vehicles while Autopilot was in use.

Autopilot differs from Tesla’s more advanced Full Self-Driving (FSD) package, which sells for $10,000 or $199 per month in the United States to a highly select group of drivers.

Indeed, in order to qualify to join the FSD beta program, drivers must have Driver Safety Scores of 98 and up. Previously, FSD was limited to drivers with perfect 100 scores.

FSD is currently not fully autonomous driving (indeed it is only Level 2), and it is not approved by US officials. It still requires a driver behind the wheel paying attention, keeping their hands on the wheel, and being ready to take over.

Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
