Tesla’s Full Self-Driving Goes Rogue, Tries to Drive Owner Into a Lake, Video Shows

Tesla’s Full Self-Driving attempts to drive an owner into a lake, sparking viral concern and intensifying scrutiny over the system’s safety and readiness.


Daniel Milligan, the owner behind the viral video, shared his unsettling experience on Twitter, revealing that his Tesla, running the latest version of FSD (14.2.2.4), veered dangerously toward a lake while in use.

Tesla’s Full Self-Driving software, marketed as a cutting-edge solution for autonomous driving, has been a central point of controversy. While Tesla continues to roll out updates to improve the system, real-world incidents like this suggest it is far from the fully autonomous driving experience promised by the company.

FSD’s Dangerous Edge Cases Continue to Raise Concerns

The recent incident is far from an isolated case. Tesla’s FSD system has been linked to a series of dangerous events, including high-profile crashes and near-misses. In May 2025, a Tesla on FSD flipped upside down after veering off the road, an event the driver claimed they could not prevent.

Similarly, in December 2025, a Tesla in China collided with another vehicle after the FSD system initiated a lane change into oncoming traffic during a livestream. These incidents are part of a troubling pattern of malfunctions that continues to raise serious questions about the system’s readiness for full deployment, reports Electrek.

Tesla has also faced criticism over the functionality of FSD’s latest build, version 14.2.2.4, which was released in January 2026. The update, touted as a polished version of its predecessor, did not include any major changes or fixes to prevent dangerous edge cases.

While it did introduce improvements to the vehicle’s neural network for higher-resolution vision, the system’s ability to handle emergency vehicles and avoid high-risk situations remains limited. This continues to undermine confidence in the technology, with critics arguing that FSD’s capabilities are far below what Tesla has led customers to expect.

Regulatory Scrutiny Grows as FSD Incidents Multiply

Tesla’s Full Self-Driving system is now under heightened scrutiny from regulators, who are increasingly concerned about its safety. In October 2025, the National Highway Traffic Safety Administration (NHTSA) launched an investigation into 2.88 million Tesla vehicles, linking 58 incidents to FSD, including crashes and injuries.

Among the most troubling issues were reports of FSD running red lights and driving into oncoming traffic. With over 50 deaths associated with Tesla’s driver-assistance systems, including both Autopilot and FSD, the pressure is mounting for the company to address these safety concerns.

The regulatory landscape is further complicated by Tesla’s handling of incident reporting: NHTSA has also opened an investigation into the company’s failure to report FSD-related crashes in a timely manner. As public and regulatory pressure grows, questions continue to swirl about Tesla’s approach to fully autonomous driving. The shift to a subscription-only model for FSD, announced in February 2026, further complicates matters, signaling that the system is still a work in progress despite years of marketing promises about its capabilities.

A System Still Far from Fully Autonomous

Despite Tesla’s ambitious promises, the FSD system remains a far cry from the fully autonomous technology that has been marketed for years. The lake incident highlights the gap between what Tesla has sold customers and the reality of the technology’s limitations. While FSD has made strides in areas such as lane-keeping and traffic-aware cruise control, incidents like these show that the system is still prone to critical failures that could have deadly consequences.

In its current form, FSD should not be considered “full self-driving” by any reasonable definition. The software’s tendency to make life-threatening errors, combined with the company’s ongoing struggle to deliver on its promises, calls into question the wisdom of continuing to market it as fully autonomous.
