Drivers Use Tesla Autopilot at Their Own Risk, Investigators Conclude

"I think the really difficult case is going to be a case in which some third party is hurt."

After a six-month investigation, the National Highway Traffic Safety Administration (NHTSA) today closed its inquiry into a fatal crash involving a Tesla in Autopilot mode, concluding that no design defect occurred, a finding that precludes a product recall and clears Tesla of fault in the incident.

The fatal collision occurred on May 7, 2016, when 40-year-old Joshua Brown's Tesla Model S struck an 18-wheel truck and trailer turning left at an intersection. A statement from Tesla at the time said that an exceptional set of circumstances, an unusually tall white trailer against a bright sky, meant the vehicle's computer vision system did not detect the crossing truck; the car passed under the trailer at full speed, tearing off its roof.

According to the report, the examination:

"[D]id not identify any defects in the design or performance of the AEB or Autopilot systems of the subject vehicles… The Autopilot system is an Advanced Driver Assistance System (ADAS) that requires the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes."

The truck driver's testimony suggested that Brown may have been watching a DVD at the time of the crash, contrary to Tesla's instructions that drivers remain attentive and keep their hands on the wheel when Autopilot is engaged. (Tesla's own press kit describes the Autopilot function as "a hands-on experience to give drivers more confidence behind the wheel.")

Joshua Brown had an active YouTube page showing off Tesla's Autopilot feature.

According to Ryan Calo, a law professor at the University of Washington who specializes in robotics law, the NHTSA's decision reflects the principle of "assumption of risk": that the affected party knowingly accepted the dangers of the activity in question, meaning the manufacturer was not at fault.

"I think the really difficult case is going to be a case in which some third party is hurt," said Calo. "If someone has a Tesla on autopilot, is not paying attention and runs into a third party, Tesla is not going to be able to argue assumption of risk in that instance … so that will pose a very difficult legal problem for Tesla in my opinion."

Were such an incident to occur, Calo suggested, courts would not be bound by any precedent from today's findings.

In a tweet, Elon Musk described the report's findings as "very positive," later quoting figures from the report showing that the crash rate of Tesla vehicles dropped significantly after the Autosteer feature was installed.

Report highlight: "The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation."
— Elon Musk (@elonmusk) January 19, 2017

Tragic accidents notwithstanding, polls suggest that the American public is slowly becoming more accepting of self-driving vehicles: a recent survey found that 23 percent of adults said they would ride in an autonomous vehicle now, while 42 percent said they would consider doing so in the future.