Musk: Tesla Is Not Recognized For ‘Lives Saved’ By Autopilot


Elon Musk was named Time’s 2021 Person of the Year and spoke with the magazine’s editor-in-chief and CEO, Edward Felsenthal, about his work, including accidents involving the Autopilot technology of his electric car company, Tesla.

The company’s autonomous driving technology was linked to 12 accidents in 2018, which led to legal action from customers. Some lawsuits concern fatal accidents; others seek redress against Tesla for misrepresentation and deceptive marketing of its Autopilot and Full Self-Driving services.

Speaking on the subject, Musk said the technology is remembered for accidents but not credited for “lives saved” during its use. “Someone told me early on when we were looking for autonomy: even if you save 90% of lives, the 10% you don’t save will sue you,” Musk told the magazine.

“I think it’s one of those things where you won’t necessarily be rewarded for the lives you save, but you’ll definitely be blamed for the lives you don’t save.” Musk also pointed out that the vehicles’ autonomous driving capabilities have recently shown notable improvements month over month.

Contrary to what its name might suggest, Tesla’s Autopilot is designed to assist with steering; it does not make the car drive on its own and still requires the driver’s full attention to avoid accidents. When activated, the technology monitors the surrounding environment and keeps the car centered in its lane at a safe distance from other vehicles.

Under investigation

According to Tesla’s Vehicle Safety Report, in the second quarter of 2021 alone, the company recorded one collision for every 4.41 million miles (about 7 million kilometers) driven with Autopilot engaged. In Tesla cars driven without Autopilot, the rate rises to one collision per 1.2 million miles, or about 1.92 million kilometers.

In August, the National Highway Traffic Safety Administration (NHTSA) of the United States began investigating 765,000 Tesla vehicles produced since 2014. The investigation was opened after the Full Self-Driving software was involved in 11 accidents, killing one person and injuring 17 others.

In September, NHTSA sent letters to Tesla asking for more information about non-disclosure agreements with owners, and also asked the company to recall vehicles whenever a software update was needed to fix safety defects. Tesla updated Autopilot soon after the investigation began, in an attempt to address the problems.