
Elon Musk's driverless dreams take a hit as Tesla Autopilot accidents pile up

The death of two men travelling in a Tesla Model S has again raised concerns about the brand's safety record



Battling the flames of a burning car for four hours was certainly unusual. But firefighters only grew more confused when they were finally able to look inside the vehicle.

Neither of the car’s passengers, two men aged 59 and 69, appeared to have been in the driving seat when it collided with a tree. “No one was driving the vehicle at the time of the crash,” said police constable Mark Herman, discussing Saturday’s fatal accident.

The car, a 2019 Tesla Model S, was manoeuvring around a curve when it went off the road, travelled about 100 feet and hit a tree at 11.25pm local time.

The men’s wives had watched them leave just minutes before the collision, after the pair had discussed Tesla’s Autopilot feature, he said.

“This is tragic but this is not a huge surprise,” says Phil Koopman, chief technology officer of Edge Case Research.

“You can see YouTube videos of people in Tesla cars jumping out of the driver’s seat... It’s no surprise that someone got unlucky when they were doing that, and if nothing changes we can expect to see it happen again.”

The crash calls Tesla chief executive Elon Musk’s driverless car dreams into question.

Rescue workers attend the scene where a Tesla electric SUV crashed into a barrier on U.S. Highway 101 in Mountain View. Credit: Stringer/Reuters


Academics say that whether the technology was switched on or not is beside the point. What is more concerning is the trend of Tesla drivers taking their hands off the wheel.

“Even if the final analysis blames driver error,” says Koopman, who is also an associate professor at Carnegie Mellon University, “that won’t prevent the next death.”

Tesla, which recently disbanded its communications team, states in its user manual that drivers should always be behind the wheel and attentive even when its Autopilot feature is on. The company says its Full Self-Driving (FSD) software also requires a human driver. Yet videos of drivers testing Tesla’s driving assistance technology to its limits are easily found on social media. Musk himself broke his own rule by taking his hands off the wheel while in Autopilot mode during a TV interview in 2018.

“I don’t know about you, but how many people do you know who read their car user manual?” asks Carla Bailo, president of the Center for Automotive Research in Michigan.

Bailo thinks Tesla needs to improve its marketing of what its technology can and cannot do before someone else gets hurt. “If the dealer isn’t making it clear, then you need to have a fallback system because you cannot rely on people to read a manual.”

Bailo says how features like Autopilot and FSD are defined is important to avoid “misleading” the general public. “Tesla may want to think about changing the term ‘Autopilot’ because that might lead you to imagine exactly what happens in an aeroplane,” she says.

Backseat driver