Lady Gives Birth Whilst Tesla Used Autopilot

Tesla’s Autopilot is the brand’s driver assistance system, and an unusual case of its use has been reported in the American media.

For years now bad drivers have used Tesla’s Autopilot system for unbelievably stupid reasons, but now there is at least a rare piece of positive news associated with the system.

According to the Philadelphia Inquirer, in September Yiran Sherry, 33, and her husband Keating Sherry, 34, were taking their three-year-old son Rafa to pre-school.

Image credit: Tesla

Tesla baby

According to the report, Yiran Sherry was heavily pregnant when her waters broke, and her husband placed her in the front passenger seat.

According to the Philadelphia Inquirer, problems began as her contractions increased rapidly while the couple remained stuck in traffic.

Keating Sherry reportedly switched on Autopilot after setting the navigation system to the hospital, 20 minutes away.

The husband reportedly laid one hand on the car’s steering wheel (to stop Autopilot disengaging) as he attended to his wife.

“She was squeezing my hand to the point where I thought she was going to shatter it,” Keating Sherry told the Inquirer. “I was [saying] ‘Yiran, OK, focus on your breathing.’ That was advice to myself, as well. My adrenaline was pumping.”

Yiran Sherry reportedly gave birth to her daughter as they arrived at the hospital.

Nurses cut the baby’s umbilical cord over the car’s front seat.

Hospital staff reportedly named the baby girl the “Tesla baby”.

Autopilot misuse

This case does raise questions about drivers misusing Tesla’s Autopilot system.

Autopilot is a driver assistance system, and is not designed to drive the vehicle without the driver remaining in control.

Previous examples of bad drivers misusing Autopilot in stupid ways include using Autopilot to drive home while drunk, only to crash into a stationary police car; using Autopilot to allegedly show off to friends, killing them instead; using Autopilot to drive on a motorway while both driver and passenger slept; and using Autopilot so the driver could check on his dog, again resulting in a crash into a stationary police car.

Tesla’s Autopilot is thus being investigated by a US safety watchdog, over a number of high-profile collisions with emergency service vehicles while Autopilot was in use.

Autopilot differs from Tesla’s more advanced Full Self-Driving (FSD) package, which sells for $10,000 or $199 per month in the United States, and is available only to a highly select number of drivers.

Indeed, in order to qualify to join the FSD beta program, drivers must have Driver Safety Scores of 98 and up. Previously, FSD was limited to drivers with perfect 100 scores.

FSD is not currently fully autonomous driving (indeed it is only Level 2), and it is not approved by US regulators. It still requires a driver behind the wheel paying attention, keeping their hands on the wheel, and being ready to take over.

Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
