Tesla Autopilot Design Problems May Have Contributed To Accident: NTSB

Federal transportation officials suggest that problems with the design of Tesla’s “Autopilot” driver-assistance, or partial self-driving, features may have been a contributing factor in a January 2018 accident, in which an inattentive driver was permitted to disengage entirely from operating a Model S vehicle.

The National Transportation Safety Board (NTSB) released a new report this week, which indicates the probable cause of the crash included the failure of the advanced driver assistance system (ADAS) to recognize a hazard ahead and automatically apply the brakes. This, along with driver error, likely caused the collision with a stationary fire truck, the board concluded.

In January 2018, a 2014 Tesla Model S crashed into the back of a Culver City, California fire truck on Interstate 405 near Los Angeles. The fire truck was parked diagonally over the shoulder and left lane of the highway with its emergency lights flashing while firefighters handled a different crash.


According to the investigation, the crash occurred after a larger vehicle ahead of the Tesla moved out of its lane to avoid the parked fire truck. With the ADAS engaged, the Model S reportedly began to speed up and did not recognize the fire truck hazard ahead, causing the two vehicles to collide at approximately 31 miles per hour. No one was injured.

The NTSB began probing the crash after the Tesla’s driver reported that the Autopilot system was activated but failed to engage the brakes. He also reported that no audible or visual hands-on warning was given to alert him to retake control of the vehicle.

Investigators say the probable cause of the rear-end collision was a combination of the driver’s lack of response to the fire engine, due to inattention and over-reliance on the Autopilot system, and a flaw in the system, which did not recognize the hazard, engage the brakes, or warn the driver that manual intervention was required.

The Center for Auto Safety, a consumer watchdog group, said on Wednesday the NTSB report should prompt the National Highway Traffic Safety Administration (NHTSA) to recall the vehicles, because the system, as described, is defective and dangerous.

NTSB officials are currently investigating at least three fatal U.S. crashes involving Tesla vehicles that were equipped with the Autopilot system.

Self-Driving System Debate

Autopilot or self-driving systems require several different safety features to work together to protect not only the driver but also surrounding vehicle occupants and pedestrians.

With as many as 94% of all roadway accidents resulting from human error, these systems aim to significantly reduce or eliminate such errors through features like adaptive cruise control, pedestrian crash avoidance mitigation (PCAM) systems, lane departure warnings, automatic lane centering, blind spot warnings and automatic emergency braking (AEB) systems.

Although there is a major push to make these driver assistance systems universal among all new vehicles, and the technologies have been shown to play a major role in mitigating crashes and preventing injuries over the last several years, researchers have raised concerns about potential changes in driving behavior and a decline in basic driver safety efforts as motorists come to rely on the systems.

In an October 2018 study by the American Automobile Association (AAA), researchers indicated that at least a third of individuals using modern crash avoidance systems have become reliant upon them and fail to follow normal driving safety procedures.

Researchers noted from survey data that while the technology helps prevent crashes when a driver makes an error, crash avoidance systems have not eliminated the need for drivers to remain aware of their surroundings and drive safely. Researchers also reported finding some evidence that drivers are adapting to the systems in unsafe ways, which could lead to crashes, injuries or fatalities if the systems fail.

