Tesla Autopilot Accident Blamed On Technology Failure and Lack of Regulatory Oversight: NTSB
Following the investigation of a fatal 2018 crash involving a Tesla Model X operated with its “Autopilot” features engaged, federal safety officials have issued a scathing report indicating the accident was caused by a combination of the electric car maker’s design of the driver assistance system and a lack of appropriate regulatory oversight for emerging hands-free driving features.
On February 25, the National Transportation Safety Board (NTSB) released a series of safety recommendations, determining that the Tesla Autopilot accident was probably caused by limitations of the technology, among other contributing factors.
The NTSB reviewed the findings of its more than year-long investigation into the March 23, 2018 crash, which killed the 38-year-old driver. He suffered multiple blunt-force injuries after his 2017 Tesla Model X P100D electric SUV entered the exit ramp connecting US-101 to State Route 85 and struck a damaged, nonoperational crash attenuator at 71 mph.
According to the investigation, at the time of the accident the driver was playing a game on his iPhone while using Tesla’s semi-autonomous Autopilot system. After reviewing the vehicle’s “Carlog” data, officials determined Tesla’s Autopilot failed to recognize the concrete barrier within 375 feet, and instead began accelerating from 61.9 mph toward the preset cruise speed of 75 mph before impact.
NTSB Chairman Robert Sumwalt indicated that Tesla’s forward collision warning system did not alert the driver to manually take control, and the automatic emergency braking did not activate to prevent the head-on collision. Immediately following the impact, officials reported, the Tesla’s high-voltage battery was breached, engulfing the remains of the vehicle in fire.
“In this crash we saw an overreliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures that, when combined, led to this tragic loss,” Sumwalt said in the press release. “The lessons learned from this investigation are as much about people as they are about the limitations of emerging technologies.”
The board faulted several parties for the incident, including Tesla, the National Highway Traffic Safety Administration (NHTSA), Apple Inc. and California’s highway agency CalTrans.
Tesla was criticized for misleading marketing of its “self-driving” claims, when the vehicles are actually equipped only with an advanced driver assistance system. Sumwalt indicated the misleading claims have caused drivers to become over-reliant on the vehicles’ abilities, which has been a factor in at least 14 Tesla Autopilot-related failures resulting in potentially fatal crashes.
Sumwalt noted that no vehicle currently available to U.S. consumers can legitimately claim to be “self-driving,” and warned that consumers driving a car with an advanced driver assistance system do not own a self-driving car.
The NTSB blamed the NHTSA for failing to provide effective oversight of Tesla’s “Level 2” driver assistance program. The board also criticized CalTrans for failing to replace the damaged crash attenuator in front of the concrete gore point, a repair that most likely would have saved the driver’s life.
The NTSB released nine safety recommendations to the NHTSA, the Occupational Safety and Health Administration, SAE International, Apple Inc., and other manufacturers of portable electronic devices.
Those recommendations included expanding the NHTSA’s New Car Assessment Program to test forward collision avoidance systems, collaborative development of standards for driver monitoring systems to ensure drivers do not disengage and misuse Autopilot systems, and an evaluation of Tesla’s Autopilot-equipped vehicles to determine the system’s operating limitations.
The NTSB also specifically called on Apple Inc. to develop distracted-driving lock-out mechanisms or an application that automatically disables portable electronic devices while a vehicle is in motion.
Self-Driving Car Accidents
Autopilot or self-driving systems require several different safety features to work together to protect not only the driver but also surrounding vehicle occupants and pedestrians.
With an estimated 94% of all roadway accidents resulting from human error, these systems aim to significantly reduce or eliminate such errors through features like adaptive cruise control, pedestrian crash avoidance mitigation (PCAM) systems, lane departure warning, automatic lane centering, blind spot warnings and automatic emergency braking (AEB).
Although there is a major push to make self-driving systems universal among new vehicles, and these technologies have been shown to play a major role in mitigating crashes and preventing injuries over the last several years, researchers have raised concerns that over-reliance on the systems may change driving behavior and erode basic driver safety habits.
"*" indicates required fields