Autopilot Accident Not Caused By Vehicle Safety Defect, NHTSA Determines
The National Highway Traffic Safety Administration (NHTSA) has determined that there is no evidence a recent autopilot accident involving a Tesla was caused by a safety defect, indicating that there is no need for a recall or regulatory action at this time.
An investigation (PDF) was launched on June 28 into the first fatal accident involving a self-driving vehicle, following a May 7 crash of a Tesla Model S in which Joshua Brown was killed when the vehicle collided with a tractor trailer crossing its path at an uncontrolled intersection. The death sparked concerns over the safety of autopilot technology, and whether drivers and passengers could be at risk from safety problems with the systems.
Data pulled from the vehicle covering the minutes before the crash indicates Brown was operating the vehicle in Autopilot mode at the time of the collision, and that the Automatic Emergency Braking (AEB) system provided no warning or automated braking for the collision event. The data also indicates Brown took no braking, steering, or other action to avoid the collision.
According to the NHTSA’s autopilot accident evaluation, Brown “should have been able to take some action before the crash, like braking, steering or attempting to avoid the vehicle.” The crash scene investigators determined the truck should have been visible to Brown for at least seven seconds before impact.
Investigators searched for potential defects in the design and execution of the AEB and Autopilot systems, but after reviewing the accident were unable to determine that either system failed in any way. According to the report, the Model S AEB systems are designed to prevent rear-end collisions and are not designed to fully or reliably mitigate or avoid all crash scenarios, including path-crossing collisions.
After reviewing the Autopilot system, investigators found that the Advanced Driver Assistance System (ADAS) still requires the continual and full attention of the driver, who must monitor surrounding traffic and be prepared to take action to avoid a crash.
The NHTSA’s Office of Defects Investigation (ODI) warns that drivers must be fully aware of a semi-autonomous vehicle’s capabilities and understand that driver action is required in certain crash scenarios; crash avoidance cannot be left entirely to the vehicle itself.
The findings of the investigation still raise questions about the ability of autonomous vehicles to recognize driver misuse of the system. Despite numerous subpoenas and requests for driver error safeguard information from Tesla and its supplier Mobileye, no direct answers were received, investigators noted. Separate from the NHTSA investigation, the U.S. National Transportation Safety Board has also begun probing the crash, and could open its own investigation into the safety risks of Autopilot-related crashes.