Tesla has started to ask for authorization from beta users of the Full Self-Driving (FSD) autonomous driving system so that the cars’ internal and external cameras can record videos in cases of safety risks or accidents, linking the recordings to drivers and specific cars.
Previously, the company already used video footage to train the machine-learning system that improves the software and its tools, but the recordings were anonymous.
“By enabling FSD Beta, I provide permission for the collection of images associated with the Vehicle Identification Number by external and cabin cameras in the event of a serious hazard or safety event such as a collision,” state the terms of service of the new beta version (10.5).
According to the Electrek portal, the change may mean that Tesla is seeking to secure evidence for cases in which FSD is blamed for an accident.
Assistant, not autonomous
It is worth noting that the US National Highway Traffic Safety Administration (NHTSA), the country’s road safety agency, opened an investigation into Autopilot in August. The agency is also analyzing a complaint involving FSD in the collision of a Model Y in early November. According to the driver’s account, the software forced the car in the wrong direction and made it impossible for them to take back control.
Currently, Tesla is making the new test version available to individuals with a Safety Score above 98 — a company metric for measuring day-to-day driving behavior. In addition to being an exemplary driver, the owner must purchase the feature for a monthly fee of US$199 or a one-time payment of US$10,000.
Despite its name, FSD is still considered a Level 2 driver-assistance system: it can control steering, braking and acceleration under specific circumstances, yet it still requires the driver’s attention.