Drivers who pay to join the beta of Tesla’s Full Self-Driving (FSD) system must now consent to releasing identifiable video footage in the event of an accident
Tesla is reportedly requiring drivers to consent to the collection of identifiable video footage from a car’s internal and external cameras in the event of an accident.
The requirement does not apply to all drivers, only to those who opt into Tesla’s FSD beta program.
FSD is separate from Tesla’s Autopilot driver-assistance system, which is currently being investigated by a US safety watchdog over a number of high-profile collisions with emergency-service vehicles that occurred while Autopilot was in use.
Tesla sells the more advanced Full Self-Driving (FSD) package for $10,000, or $199 per month, in the United States to a highly select group of drivers.
Indeed, to qualify for the FSD beta program, drivers must have a Driver Safety Score of 98 or higher. Previously, the beta was limited to drivers with perfect scores of 100.
FSD is not fully autonomous driving (it is only Level 2), and it is not approved by US officials. It still requires a driver behind the wheel paying attention, keeping their hands on the wheel, and being ready to take over.
But now, according to a report by Electrek, Tesla is asking FSD drivers to consent to the carmaker collecting video from a car’s exterior and interior cameras in the event of an accident or “serious safety risk.”
The report stated that while Tesla had gathered video footage as part of FSD before, it was only used to train and improve its AI self-driving systems. The electric-car giant always ensured that the footage was anonymous and never associated with a particular vehicle.
That has now changed: testers of the FSD beta must accept that Tesla can use footage from both inside and outside the car in the event of a safety risk or accident.
Tesla revealed the change in an updated terms-and-conditions warning shown to testers when they download a new version of the FSD beta.
“By enabling FSD Beta, I consent to Tesla’s collection of VIN-associated image data from the vehicle’s external cameras and Cabin Camera in the occurrence of a serious safety risk or a safety event like a collision,” Electrek reported the new warning as stating.
Electrek notes that the important part of the warning is the “VIN-associated” wording, which means the footage collected will be tied to the owner’s vehicle.
It pointed out that Tesla’s language could indicate the firm wants to ensure it has evidence in case its FSD system is blamed for an accident.
That said, the footage could also help Tesla detect and fix serious issues more quickly.