Crashes, red lights, and wrong turns: Tesla’s self-driving system is under US federal scrutiny after 44 alarming incidents raised fresh safety concerns.
The US National Highway Traffic Safety Administration (NHTSA) has launched a federal investigation into Tesla’s Full Self-Driving (Supervised) system, known as FSD, over potential safety defects. The probe follows reports that the technology may have caused collisions and traffic violations.
According to NHTSA, 44 separate incidents have been linked to Tesla vehicles using FSD. Drivers claimed the system ran red lights, veered into oncoming traffic, or made other hazardous moves that led to crashes, some of which resulted in injuries. The investigation covers all Tesla models equipped with FSD or its earlier FSD (Beta) version, an estimated 2.88 million vehicles.
Although Tesla markets FSD as a step towards autonomy, the company instructs drivers to remain alert and ready to take control at any moment. The agency’s Office of Defects Investigation has opened a Preliminary Evaluation to determine whether drivers had sufficient warning or time to respond to unexpected vehicle behaviour.
The review will also assess how FSD recognises and reacts to traffic signals, lane markings, and road signs, as well as how it alerts drivers about its intended actions.
Tesla has yet to comment on the investigation. The company recently rolled out FSD version 14.1 to customers.
Tesla CEO Elon Musk has long promoted FSD as a pathway to fully autonomous ‘robotaxi’ capabilities. However, despite repeated assurances, that vision remains unrealised. Tesla has since acknowledged that future upgrades will require both new hardware and new software.
The company is currently trialling a Robotaxi-branded ride-hailing service in Texas and other locations, but the vehicles still include human safety drivers.
The probe comes amid reports that staff cuts earlier this year at NHTSA, under directives from Musk and President Donald Trump, may have affected the agency’s capacity to oversee autonomous vehicle safety.























