Tesla is reportedly requiring drivers to consent to the collection of identifiable video footage from the car’s internal and external cameras in the event of an accident.

The consent to video collection does not apply to all drivers, only those who opt into Tesla’s FSD beta program.

FSD is separate from Tesla’s Autopilot driver-assistance system, which is currently under investigation by a US safety watchdog over a number of high-profile accidents involving emergency service vehicles while Autopilot was engaged.

Image credit: Tesla

FSD beta

Tesla sells the more advanced Full Self-Driving (FSD) capability package for $10,000, or $199 per month, in the United States, to a highly select group of drivers.

Indeed, in order to qualify for the FSD beta program, drivers must have a Driver Safety Score of 98 or higher. Previously, FSD was limited to drivers with a perfect score of 100.

FSD is currently not fully autonomous driving (it is only Level 2 automation), and it has not been approved by US regulators. It still requires a driver behind the wheel paying attention, keeping their hands on the wheel, and being ready to take over.

But now, according to a report by Electrek, Tesla is asking FSD drivers to consent to the car maker collecting video taken by a car’s exterior and interior cameras in the event of an accident or “serious safety risk”.

The report stated that while Tesla had previously gathered video footage as part of FSD, it was only used to train and improve its AI self-driving systems. And the electric car giant always ensured that the footage was anonymous and never associated with a person’s vehicle.

Identifiable video

But this has now changed: testers of the FSD beta program must accept that Tesla can use footage from both inside and outside the car in the event of a safety risk or accident.

Tesla revealed the change in an updated terms and conditions warning, shown when testers download a new version of the FSD beta.

“By enabling FSD Beta, I consent to Tesla’s collection of VIN-associated image data from the vehicle’s external cameras and Cabin Camera in the occurrence of a serious safety risk or a safety event like a collision,” Electrek reported the new warning as stating.

Electrek notes that the important part of the warning is the “VIN-associated” wording, which means that the footage collected will be linked to the owner’s vehicle.

It pointed out that Tesla’s language could indicate the firm wants to ensure it has evidence in case its FSD system is blamed for an accident.

That said, the footage could also be used to detect and fix serious issues more quickly.

Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
