Tesla’s Full Self-Driving Faces Crisis: NHTSA Opens Investigation After 58 Crashes

The National Highway Traffic Safety Administration (NHTSA) has launched a new investigation into Tesla’s Full Self-Driving (FSD) system after identifying 58 incidents linked to the technology.


The reported incidents involve traffic safety violations, including running red lights and executing improper lane changes. The investigation will assess the potential safety risks posed by these behaviors and determine whether the system presents a significant danger to drivers and pedestrians.

The investigation is part of growing scrutiny surrounding Tesla’s Full Self-Driving system. Tesla’s driver assistance technologies, including Autopilot and Full Self-Driving, have been at the center of multiple regulatory inquiries in recent years. This new probe adds to the mounting concerns over the reliability and safety of Tesla’s autonomous driving technology.

NHTSA’s Areas of Focus

The NHTSA’s investigation will focus on two specific traffic violations that have been linked to Tesla’s Full Self-Driving system. The first involves vehicles failing to stop at red traffic lights or not stopping fully at intersections.

According to NHTSA, there have been 18 complaints and one media report alleging that Tesla vehicles with FSD engaged did not stop properly for red lights. Some incidents reportedly occurred at the same intersection in Joppa, Maryland, where local authorities got involved. Tesla issued an update to address the issue at this location, although it is unclear whether the company notified NHTSA of the problem at the time, Electrek reports.

The second scenario under investigation involves FSD making lane changes into opposing traffic or into the wrong lane entirely. NHTSA has identified 18 complaints, two media reports, and several reports submitted by Tesla under NHTSA’s Standing General Order (SGO) that describe instances where Tesla vehicles, with FSD engaged, crossed into opposing lanes or attempted to turn in the wrong direction. In some of these cases, FSD reportedly gave the driver too little warning to intervene in time to avoid the violation.

Scope of the Investigation

The NHTSA’s investigation will cover all Tesla vehicles equipped with Full Self-Driving (Supervised) or FSD Beta software, with an estimated 2,882,566 vehicles affected. The agency has identified 58 incidents in total related to these two specific types of safety violations, including 14 crashes or fires and 23 injuries. NHTSA’s investigation will examine whether these behaviors represent a broader pattern of risk and whether any corrective actions are necessary to improve the safety of the system.

The regulator is also interested in determining whether drivers had adequate time and warning to intervene in these incidents before a traffic violation occurred. While the investigation will focus on these particular safety concerns, NHTSA has indicated that it will also consider other potential issues, such as FSD’s performance near railroad crossings, as previously reported in the media.


Ongoing Legal and Regulatory Challenges for Tesla

Tesla is facing increasing legal and regulatory challenges related to its driver assistance systems, including the Full Self-Driving technology. In the past year, Tesla lost its first wrongful death trial, in which the company was found partially liable for a fatal crash involving its driver assistance systems. Since then, Tesla has settled two more wrongful death lawsuits connected to its ADAS technologies.

In addition to legal challenges, Tesla is also under investigation by the California Department of Motor Vehicles (DMV) for misleading customers about the capabilities of its systems. The DMV is focusing on whether Tesla’s marketing of Full Self-Driving technology has been deceptive, given the current limitations of the system. A judge is expected to decide on this case in the coming months.
