Federal transportation officials suggest that problems with the design of Tesla’s “Autopilot”, or self-driving, features may have been a contributing factor in a January 2018 accident, in which an inattentive driver was permitted to disengage entirely from operating a Model S vehicle.
The National Transportation Safety Board (NTSB) released a new report this week, which finds that the advanced driver assistance system (ADAS) failed to recognize a hazard ahead and automatically apply the brakes. This failure, along with driver error, likely caused the collision with a stationary fire truck, the board concluded.
In January 2018, a 2014 Tesla Model S crashed into the back of a Culver City, California fire truck on Interstate 405 near Los Angeles. The fire truck was parked diagonally over the shoulder and left lane of the highway with its emergency lights flashing while firefighters handled a different crash.
According to the investigation, the crash occurred after a larger vehicle ahead of the Tesla moved out of its lane to avoid the fire truck. With the ADAS engaged, the Model S reportedly began to speed up and did not recognize the parked fire truck ahead, causing the two vehicles to collide at approximately 31 miles per hour. No one was injured.
The NTSB began probing the crash after the Tesla’s driver reported the Autopilot system was activated but failed to engage the brakes. He also reported that no audible or visual warnings were given to alert him to keep his hands on the wheel.
Investigators say the probable cause of the rear-end collision was a combination of the driver’s lack of response to the fire truck, due to inattention and over-reliance on the Autopilot system, and a flaw in the system, which did not recognize the hazard, engage the brakes, or warn the driver that manual intervention was required.
The Center for Auto Safety, a consumer watchdog group, said on Wednesday the NTSB report should prompt the National Highway Traffic Safety Administration (NHTSA) to recall the vehicles, because the system, as described, is defective and dangerous.
NTSB officials are currently investigating at least three fatal U.S. crashes involving Tesla vehicles that were equipped with the Autopilot system.
Self-Driving System Debate
Autopilot or self-driving systems require several different safety features to work together to protect not only the driver but also surrounding vehicle occupants and pedestrians.
With as much as 94% of all roadway accidents being the result of human error, these systems aim to significantly reduce or eliminate these types of errors through features such as adaptive cruise control, pedestrian crash avoidance mitigation (PCAM) systems, lane departure warnings, automatic lane centering, blind spot warnings and automatic emergency braking (AEB) systems.
Although there is a major push to make self-driving systems universal among all new vehicles, and self-driving technologies have been shown to play a major role in mitigating crashes and preventing injuries over the last several years, researchers have raised concerns that drivers who rely on the systems may change their behavior and neglect basic safety practices.
In an October 2018 study by the American Automobile Association (AAA), researchers indicated that at least a third of individuals using modern crash avoidance systems have become reliant upon them and fail to follow normal safe-driving practices.
Researchers noted from survey data that while the technology helps prevent crashes when a driver makes an error, crash avoidance systems have not eliminated the need for drivers to be aware of their surroundings and drive safely. Researchers reported finding some evidence that drivers are adapting to the self-driving systems in unsafe ways that could cause crashes, injuries or fatalities if the systems fail.