Tesla Autopilot Accidents With Stopped Emergency Vehicles Lead to NHTSA Investigation
Federal highway safety officials have launched an investigation into potential problems with Tesla Autopilot systems, following nearly a dozen crashes involving stopped vehicles or equipment at first responder scenes.
The National Highway Traffic Safety Administration (NHTSA) announced the Tesla Autopilot investigation on August 13, outlining at least 17 injuries and one death linked to eleven separate crashes in which a Tesla electric vehicle struck other vehicles at emergency first responder scenes while Autopilot or Traffic-Aware Cruise Control was activated.
According to the NHTSA Office of Defects Investigation (ODI), most of the incidents occurred at night, and all involved scene control measures, such as first responder vehicle lights, flares, illuminated arrow boards, and road cones, which could disrupt the Autopilot functionality.
The investigation will review the safety and effectiveness of the Tesla autopilot technology used in approximately 765,000 Tesla Model 3, Model S, Model X, and Model Y electric vehicles.
Autopilot is a rapidly evolving Advanced Driver Assistance System (ADAS), which consists of many simultaneously working systems that steer, accelerate and apply the brakes of a vehicle automatically within its lane. Many experts believe these technologies can save thousands of lives by avoiding the driver errors that cause nearly all accidents. However, many of the technologies are new and may be prone to unknown programming errors.
ODI specifically states that it will begin probing the ADAS that makes up Tesla's Autopilot, with a focus on its Object and Event Detection and Response (OEDR) functionality.
Officials announced they will assess the technologies and methods used to monitor, assist, and enforce the driver's engagement with the dynamic driving task during Autopilot operation, in order to determine the contributing circumstances that allegedly caused the eleven reported crashes.
One of the higher profile Tesla autopilot crashes that will be analyzed during the investigation occurred in January 2018, when a 2014 Tesla Model S crashed into the back of a Culver City, California fire truck on Interstate 405 near Los Angeles.
According to an initial NTSB investigation, the fire truck was parked diagonally across the shoulder and left lane of the highway with its emergency lights flashing while firefighters handled a different crash. The Model S collided with the fire truck after a larger vehicle ahead of the Tesla moved out of its lane to avoid the stopped emergency vehicle. Officials reported that, with the ADAS engaged, the Model S began to speed up and did not recognize the parked fire truck ahead, causing the two vehicles to collide at approximately 31 miles per hour.
Self-Driving Car Accidents
Autopilot or self-driving systems require several different safety features to work together to protect not only the driver, but also surrounding vehicle occupants and pedestrians.
With as much as 94% of all roadway accidents attributed to human error, these systems aim to significantly reduce or eliminate such errors through features like adaptive cruise control, pedestrian crash avoidance mitigation (PCAM) systems, lane departure warnings, automatic lane centering, blind spot warnings and automatic emergency braking (AEB).
Although there is a major push to make self-driving systems universal among all new vehicles, and these technologies have shown over the last several years that they can play a major role in mitigating crashes and preventing injuries, researchers have raised concerns about potential changes in driving behavior and the erosion of basic driver safety habits that can result from reliance on these systems.