Tesla “Smart Summon” Features Under Investigation by NHTSA
Federal officials have launched an investigation into the safety of an estimated 2.6 million Tesla vehicles, following reports of accidents that may have resulted from defects in the vehicles’ “Smart Summon” and “Actually Smart Summon” driverless features.
Tesla promotes “Smart Summon,” and a newer variation known as “Actually Smart Summon,” as part of the company’s advanced driver-assistance system known as Full Self-Driving (FSD). These features are designed to enhance convenience by enabling Tesla owners to control their vehicles remotely using a smartphone app.
The Smart Summon feature was first introduced in 2019, allowing the vehicle to navigate short distances autonomously, such as from a parking spot to a pickup area, without the driver inside. The system uses sensors, cameras and software to detect obstacles, pedestrians and other vehicles while maneuvering through parking lots or private driveways.
The Actually Smart Summon feature builds on this functionality by offering improved navigation capabilities, potentially allowing the vehicle to handle more complex parking lot layouts and obstructions. Both systems are intended to eliminate the need for drivers to physically retrieve their vehicles in certain low-speed, low-risk scenarios.
However, reports of malfunctions and crashes have raised concerns about the reliability and safety of these technologies, prompting regulatory scrutiny and the investigation by federal safety officials.
The National Highway Traffic Safety Administration (NHTSA) launched a Tesla investigation on January 6, 2025, after identifying several incidents in which vehicles using the Smart Summon or Actually Smart Summon features failed to detect obstacles such as posts or parked cars. These failures led to collisions, posing risks of property damage and potential injuries.
According to the investigation notice, NHTSA’s Office of Defects Investigation (ODI) has received one complaint and three media reports of crashes involving the Actually Smart Summon feature, as well as 12 accident allegations involving vehicles using the Smart Summon feature.
Multiple crash allegations indicate that users had too little time to react to avoid a collision, either because of a limited line of sight or the time needed to release the phone app button, which stops the vehicle’s movement.
All Tesla vehicles equipped with the Actually Smart Summon or Smart Summon features are part of the investigation, including the 2017 through 2025 Model 3, 2016 through 2025 Model S and Model X, as well as the 2020 through 2025 Model Y.
The Office of Defects Investigation will assess the performance and capabilities of the Actually Smart Summon feature in light of the reported crashes, focusing on its operational behavior in real-world conditions. As part of the investigation, NHTSA will examine the maximum speed vehicles can reach while using Actually Smart Summon, the feature’s designed operating limitations for public road use and its line-of-sight requirements.
NHTSA expects the investigation to include a review of remote vehicle control through the phone app at various distances and lines of sight, including app connectivity delays that result in increased stopping distances, as well as the use of Actually Smart Summon on roadways for which the current version of the system is not intended or designed.
Tesla has not reported any accidents related to the Summon features under the Standing General Order for crashes involving an automated driving system (ADS), which mandates the reporting of crashes on publicly accessible roads.
Tesla Self-Driving Concerns
Tesla vehicles have been involved in numerous investigations and lawsuits over the past year, including several issues related to their self-driving features.
A previous Tesla self-driving technology investigation was launched in October 2024 following four pedestrian accidents, including one fatality, involving Tesla vehicles using Full Self-Driving features in areas with reduced roadway visibility, such as conditions caused by sun glare, fog or airborne dust.
In December 2024, a wrongful death lawsuit was filed against Tesla, alleging that the company misrepresented the safety of its Autopilot feature. The lawsuit claimed that Tesla portrayed Autopilot as a safe, fully autonomous driving system, leading a 33-year-old man to place his trust in the feature before his Tesla collided with a fire truck. The family alleged that Autopilot failed to detect the emergency vehicle, resulting in his death.
These ongoing investigations and legal challenges highlight growing concerns about the safety of Tesla’s self-driving technology systems.