NTSB says Tesla Autopilot was partly to blame for 2018 crash
The National Transportation Safety Board (NTSB) cited both driver error and Tesla’s Autopilot design as the probable causes of a January 2018 crash, in which a Model S slammed into a parked fire truck at about 31 mph. According to the report, the driver was distracted and did not see the fire truck. But NTSB says that Tesla’s Autopilot was also at fault, as its design “permitted the driver to disengage from the driving task.”
The driver reportedly had Autopilot engaged and was following closely behind a large SUV or truck. The lead vehicle changed lanes to move around a fire truck that was parked in the lane ahead. The Tesla driver claimed he was drinking coffee and eating a bagel and did not see the fire truck. When the lead vehicle changed lanes, the Model S no longer had a car to follow and accelerated toward its preset cruise speed. About 0.49 seconds before the crash, the vehicle detected a stationary object in the road and displayed a warning, but it was too late.
The vehicle’s Autopilot didn’t detect driver-applied steering wheel torque for the last three minutes and 41 seconds before the crash. And given the driver’s admitted distractions, NTSB says the driver was likely over-reliant on the vehicle’s driver assistance system.
Driver errors, Advanced Driver Assistance system Design, led to Jan. 22, 2018, Culver City, CA, highway crash, according to NTSB Highway Accident Brief 19/07 issued Wednesday; https://t.co/hozLB1zA7F pic.twitter.com/iyQNc2HdhT
— NTSB_Newsroom (@NTSB_Newsroom) September 4, 2019
According to Reuters, the Center for Auto Safety, a consumer watchdog group, said the NTSB report should prompt the National Highway Traffic Safety Administration (NHTSA) to “do its job and recall these vehicles … A vehicle that enables a driver to not pay attention, or fall asleep, while accelerating into a parked fire truck is defective and dangerous.” In a statement provided to Engadget, a Tesla spokesperson said, “Tesla owners have driven billions of miles with Autopilot engaged, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot remain safer than those operating without assistance.”
Tesla’s Autopilot was engaged in at least three fatal US crashes, two of which are still under investigation by the NTSB and NHTSA. Oddly enough, the January 2018 crash wasn’t the only time an admittedly distracted driver crashed into the back of a fire truck while using Autopilot. The system does issue hands-on warnings, and Tesla advises drivers to keep their hands on the wheel. But as others have pointed out, calling the driver-assist features “Autopilot” may be a bit misleading.
Update 9/5/2019 1:25PM ET: This story was updated to include a statement from Tesla.