The Lowdown Hub

Elon Musk's driverless dreams take hit as Tesla Autopilot accidents pile up

The death of two men traveling in a Model S Tesla has again raised concerns about the brand's safety record



Battling the flames of a burning car for four hours was unusual enough. But firefighters grew only more confused when they were finally able to look inside the vehicle.

Neither of the car’s passengers, two men aged 59 and 69, appeared to have been in the driving seat when it collided with a tree. “No one was driving the vehicle at the time of the crash,” said Constable Mark Herman, discussing Saturday’s fatal accident.

The car, a 2019 Tesla Model S, was rounding a curve when it left the road, travelled about 100 feet and hit a tree at 11.25pm local time.

The men’s wives had watched them leave just minutes before the collision, after the group had discussed Tesla’s Autopilot feature, Constable Herman said.

“This is tragic but this is not a huge surprise,” says Phil Koopman, chief technology officer of Edge Case Research.

“You can see YouTube videos of people in Tesla cars jumping out of the driver’s seat... It’s no surprise that someone got unlucky when they were doing that and if nothing changes, we can expect to see it happen again.”

The crash calls Tesla chief executive Elon Musk’s driverless car dreams into question.

Rescue workers attend the scene where a Tesla electric SUV crashed into a barrier on U.S. Highway 101 in Mountain View CREDIT: Stringer/REUTERS


Academics say that whether the technology was switched on or not is beside the point. What is more concerning is the trend for Tesla drivers to take their hands off the wheel. “Even if the final analysis blames driver error,” says Koopman, who is also an associate professor at Carnegie Mellon University, “that won’t prevent the next death.”

Tesla, which recently disbanded its communications team, states in its user manual that drivers should remain behind the wheel and attentive even when its Autopilot feature is on, and says its “Full Self-Driving” (FSD) feature still requires a human driver. Yet videos of drivers testing Tesla’s driving assistance technology to its limits are easily found on social media. Musk himself broke his own rule by taking his hands off the wheel while in Autopilot mode during a TV interview in 2018.

“I don’t know about you, but how many people do you know who read their car user manual?” asks Carla Bailo, president of the Center for Automotive Research in Michigan. Bailo thinks Tesla needs to improve its marketing of what its technology can and cannot do before someone else gets hurt. “If the dealer isn’t making it clear then you need to have a fallback system, because you cannot rely on people to read a manual.”

Bailo says the way features such as Autopilot and FSD are defined matters, to avoid “misleading” the general public. “Tesla may want to think about changing the term ‘Autopilot’ because that might lead you to imagine exactly what happens in an aeroplane,” she says.

Backseat driver

The other question for the industry is whether mandatory driver monitoring should be a condition of switching on driving assistance features.

According to Tesla’s user manual, the driver’s seat contains an occupancy sensor, which is used when the driver puts the car in neutral and lets it park itself.

Tesla does have a system that checks people are paying attention to the road: it senses force on the steering wheel and issues alerts, eventually slowing the car to a stop if it does not detect a human hand. Professor Koopman is baffled that Tesla has not used its seat sensor technology in the same way.
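The escalation pattern described above, warn when no steering input is sensed, then bring the car to a stop, can be sketched as a simple threshold check. The timings and state names below are hypothetical illustrations, not Tesla's actual values:

```python
# Illustrative sketch only: the thresholds here are invented, not Tesla's.
# It models the escalation pattern described in the text: no torque sensed
# on the wheel leads to warnings, and eventually to slowing the car.

def monitor_step(seconds_without_torque: int) -> str:
    """Return the monitoring action for one tick, given how long the
    system has gone without sensing a hand on the wheel."""
    if seconds_without_torque < 10:      # hypothetical grace period
        return "drive"
    elif seconds_without_torque < 30:    # hypothetical visual-warning window
        return "visual_alert"
    elif seconds_without_torque < 45:    # hypothetical audible-warning window
        return "audible_alert"
    else:                                # no response: slow the car to a stop
        return "slow_to_stop"
```

Koopman's point is that a seat occupancy sensor could feed the same kind of escalation logic with no new hardware.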

“After you [Tesla] have seen a bunch of videos over a period of years of people jumping out of the driver’s seat, and they already have the sensor to tell the drivers, why aren’t they incorporating it into Autopilot monitoring?” he asks.

“There’s no hardware required since it is already in there. Why in the world aren’t they doing this?”

Other carmakers take a different approach: General Motors’ “Super Cruise” mode uses a camera to detect whether the driver is distracted.

A recent vehicle safety report published by Tesla, which did not respond to a request for comment for this article, claims that Autopilot is safer than manual driving, yet the accident rate has not improved in two years. In the first three months of 2021, Teslas with Autopilot engaged averaged one accident per 4.19 million miles.

That compares well with NHTSA’s average of one crash every 484,000 miles, but the average distance per Tesla accident has fallen by 10.5pc compared with the year before. Despite this, Musk has said his vehicles have an almost “10 times lower” chance of an accident when Autopilot is engaged than normal cars.
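Taken at face value, the two figures quoted above imply a ratio somewhat short of Musk's "10 times" claim, although the comparison is not like-for-like, since Autopilot miles are mostly easier motorway driving while NHTSA's figure covers all roads. A quick check:

```python
# Rough comparison of the figures quoted in the text. Caveat: Tesla's
# Autopilot miles (largely highway) and NHTSA's fleet-wide average
# are not directly comparable driving conditions.
tesla_miles_per_accident = 4_190_000   # Tesla's reported Autopilot figure
nhtsa_miles_per_crash = 484_000        # NHTSA fleet-wide average

ratio = tesla_miles_per_accident / nhtsa_miles_per_crash
print(f"{ratio:.1f}x fewer accidents per mile")  # roughly 8.7x
```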

Defining safety

Elon Musk, CEO of Tesla Inc., speaking virtually during the China Development Forum 2021 CREDIT: WU HONG/EPA-EFE/Shutterstock


The definition of a normal car is critical, says Professor Koopman. “Does he mean the 2008 car that my wife drives?”

Despite high-profile crashes, Tesla was handed a five-star overall safety rating by NHTSA this year. However, academics have pointed out that the evaluation focuses on traditional criteria such as rollover resistance, rather than autonomous functionality.

“We are lacking a rigorous way to evaluate the safety of intelligent vehicles empowered by artificial intelligence,” says Ding Zhao, an assistant professor at Carnegie Mellon University who works with Uber, Toyota and Bosch. “Where do we set the bar for safety if 100pc safety is not realistic?”

In 2020, a total of 1,580 people were killed and 131,220 seriously injured in car accidents in the UK, according to government statistics. The most common cause of accidents around the world is driver error, and on the whole Zhao thinks technology that can reduce those lapses is worth investing in.

But companies, academics and regulators need to work together on a measure of how safe this technology must be before it is used on the road, he says. It is easy to assess a car’s braking mechanics, but not the proprietary models that Tesla’s, Uber’s or General Motors’ software engineers have drawn up.

Bailo agrees. “The automakers often think that these systems are their secrets as they want theirs to be the best,” she says. “But there needs to be clarity in terms of what these can and can't do and how the consumer is expected to operate in their vehicle should they choose to be in that mode - and understand that nothing today is self-correcting.”