Tesla Driver Who Hit And Killed Motorcyclist Was Allegedly Looking At His Phone And Using FSD
Authorities say the driver admitted to police that he was inattentive at the time of the crash
- A Tesla Model S driver admitted to relying too much on FSD in a crash that killed a motorcyclist.
- Tesla disclaims responsibility for crashes if drivers are inattentive.
- The incident reignites debate over the safety of Tesla’s driver-assistance systems.
On April 19, the driver of a Tesla Model S struck a motorcycle, and the rider died at the scene. Authorities have now confirmed that the Tesla owner was not only using the Full Self-Driving semi-autonomous tech but was also looking at his phone when the crash happened. The investigation is ongoing.
Full Self-Driving regularly comes under fire for its name and for the risk that owners will trust it too much. Accidents like this one demonstrate that some drivers do indeed misuse the technology. The driver of the Tesla admitted at the scene that he had been distracted and had FSD engaged at the time of the crash.
According to the Seattle Times, a trooper reported the following details about the crash. “The next thing he [the Tesla driver] knew there was a bang and the vehicle lurched forward as it accelerated and collided with the motorcycle in front of him… based on the admitted inattention to driving, while on Autopilot mode, and the distraction of the cell phone while moving forward, putting trust in the machine to drive for him.”
While the trooper wrote “Autopilot,” the name of Tesla’s less advanced semi-autonomous driving software, authorities have since used the car’s data recorder to confirm that it was running FSD. Notably, Tesla now refers to the system as FSD Supervised after several high-profile accidents of this nature.
That word, Supervised, does a lot of heavy lifting for the automaker. Engaging Autopilot or FSD requires the driver to agree that they’ll remain alert and ready to take over at any time. In theory, that arrangement would be trouble-free, but research consistently shows that humans are bad at sustaining passive attention.
“It’s human nature, really, to just kind of want to zone out and find something exciting to do other than watch the car drive,” Kelly Funkhouser, who runs automated vehicle testing for Consumer Reports, told NPR in 2021. “It’s just like watching paint dry, right? That’s what worries us most about these systems.”
“As they become more competent, then it’s easier for drivers to kind of want to check out and find something else to do.” That attention problem is not unique to Tesla; it extends to all Level 1 and Level 2 semi-autonomous systems.