Tesla Self-Driving Safety Violations Will Be Investigated by NHTSA

An investigation has been launched into the safety of nearly 2.9 million Tesla vehicles, following reports of accidents and injuries involving the company’s Full Self-Driving (FSD) features.

Tesla’s FSD feature is a driver-assistance technology designed to navigate and drive the vehicle with minimal human input. These features include automatic lane changes, traffic light and stop sign control, Navigate on Autopilot, and the ability to park or summon the car.

However, the U.S. National Highway Traffic Safety Administration (NHTSA) launched a Tesla investigation on October 7, after identifying multiple incidents where vehicles broke traffic laws, such as running red lights and changing lanes into oncoming traffic.

These violations endanger drivers, passengers, occupants of other vehicles and pedestrians, and depending on the severity of the infraction, could result in serious injury or death.


The current investigation involves 2,882,566 Tesla vehicles equipped with FSD (Supervised) or FSD (Beta), which have been linked to at least 44 separate incidents, including three crashes and five injuries.

This is not the first investigation into Tesla’s self-driving features. Previous probes have examined collisions, as well as a fatal injury involving the company’s self-driving technologies. A separate investigation into Tesla’s Smart Summon feature led to multiple accident reports.

However, in this instance, some media reports claim that the vehicles did not provide a warning of the system’s intended behavior, leaving drivers with inadequate time to respond.

According to the Office of Defects Investigation (ODI) Resume, a preliminary evaluation is being opened to assess the scope, frequency and potential safety consequences of full self-driving features that have violated traffic laws. 

Since Tesla states that the driver is fully responsible for operating their vehicle at all times, the current investigation focuses on whether certain self-driving actions interfere with the driver’s supervision when they occur unexpectedly.

These incidents include 18 complaints and one media report alleging that a Tesla failed to remain stopped for the full duration of a red light traffic signal, did not stop completely, or failed to accurately detect and display the correct traffic signal state in the vehicle interface.

Six reports have also identified incidents where Tesla vehicles approached intersections with red traffic signals and proceeded into the intersection, resulting in crashes with other vehicles. Four of these crashes caused one or more reported injuries.

NHTSA has also identified two manufacturer reports, 18 complaints and two media reports related to Tesla vehicles crossing into the opposing lane during or after a turn, crossing double-yellow lane markings while driving straight, or attempting to turn onto a road in the wrong direction. 

Additionally, the agency has found four manufacturer reports, several complaints and one media report alleging the vehicle either proceeded straight through an intersection in a turn-only lane or made a turn from a through lane.

The current review will assess if drivers received warnings with enough time to respond to unexpected self-driving behavior, including how well FSD mode detects traffic signals, lane markings and wrong-way signs, as well as the impact of system updates on safety compliance.

Tesla Autopilot Lawsuits

The NHTSA investigation comes amid a series of lawsuits filed against Tesla regarding accidents and deaths involving the vehicles’ full self-driving features.

Last year, a family filed a wrongful death lawsuit against Tesla, alleging the company misrepresented the safety of its Autopilot feature, which they claim contributed to a man’s death after his vehicle crashed into the back of a fire truck.

A Florida jury also awarded $329 million in damages to a man and the family of his deceased girlfriend only a few months ago, after finding Tesla’s Autopilot system partially responsible for the fatal crash.


Image Credit: Taljat David / Shutterstock.com

Written By: Darian Hauf

Consumer Safety & Recall News Writer

Darian Hauf is a consumer safety writer at AboutLawsuits.com, where she covers product recalls, public health alerts, and regulatory updates from agencies like the FDA and CPSC. She contributes research and reporting support on emerging safety concerns affecting households and consumers nationwide.




