US road safety regulators are investigating Tesla's Autopilot software after a series of crashes involving emergency vehicles.
In each of the eleven crashes identified by the US National Highway Traffic Safety Administration (NHTSA), Tesla vehicles with Autopilot engaged struck emergency vehicles or other cars involved in roadside emergencies, seemingly unable to process the scene.
“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” the NHTSA said.
The Office of Defects Investigation noted 17 injuries and one fatality caused by Tesla vehicles slamming into roadside emergency vehicles, with incidents spanning from 2018 to as recently as last month.
One incident from early 2018 saw a Model S slam into a parked fire truck at around 100 km/h.
In July this year, a white Tesla smashed into an unoccupied highway patrol car which had stopped to investigate another traffic accident.
Road safety investigators will “assess the technologies and methods used to monitor, assist, and enforce the driver's engagement with the dynamic driving task during Autopilot operation,” the NHTSA said, including Autopilot's Object and Event Detection and Response (OEDR) capabilities and its Operational Design Domain (ODD).
Tesla Model Y, X, S, and 3 vehicles from model years 2014 to 2021 are included in the defect report; the NHTSA estimates there are 765,000 of them on US roads.
Issues with Tesla vehicles failing to recognise parked fire trucks have been reported for years. Wired covered precisely this problem in 2018, noting that fully autonomous self-driving must still be a long way off “when even the best systems available can’t see a big red firetruck”.
Tesla has been closely scrutinised over every fatal crash involving Autopilot, and the company has been quick to dispute claims that Autopilot was at fault, as in the April crash that killed two men when their vehicle left the road and hit a tree.
Similarly, there are concerns about how Tesla enforces the requirement that drivers remain attentive to the road while Autopilot is engaged. Examples of the system being fooled into thinking a driver's hands are on the wheel are easy to find; there are even companies selling anti-nag magnets designed to mimic the force of a human hand resting on the steering wheel.
As the NHTSA investigation begins, the Elon Musk-led company is pressing ahead with distributing its upgraded Full Self-Driving mode to more customers as it looks to fulfil its ambitious promise of autonomous vehicles.