WASHINGTON, October 9, 2025: U.S. auto safety regulators have opened a formal investigation into nearly 2.9 million Tesla vehicles equipped with the company’s Full Self-Driving (FSD) system following reports of traffic violations and multiple crashes involving the software. The National Highway Traffic Safety Administration (NHTSA) confirmed Thursday that it has launched a preliminary evaluation of the advanced driver assistance system after receiving 58 complaints. The reports allege incidents in which vehicles operating with FSD committed traffic violations such as failing to yield, running red lights, and navigating unsafely through intersections.

The probe covers Tesla vehicles across the United States that are equipped with the system. According to documents released by the agency, at least 14 of the reported incidents involved collisions, and 23 people were injured. Six of those crashes occurred after Tesla vehicles entered intersections while the traffic light was red, in several cases resulting in impacts with other vehicles. Four of the red-light incidents led to injuries. The review will examine the system’s decision-making, especially in scenarios involving intersections and complex traffic controls.
NHTSA stated that the investigation aims to determine whether Tesla’s FSD feature “undermines driver adherence to traffic safety laws” when it is engaged. While Tesla markets FSD as an advanced driver-assistance system requiring driver supervision, the agency is examining whether its operation could increase the risk of traffic law violations or crash involvement. The probe affects an estimated 2.88 million Tesla vehicles in the United States. Affected models include Model 3, Model Y, Model S, and Model X vehicles that have received FSD software updates.
Tesla faces another US probe over automated driving software
NHTSA said it will assess the performance and limitations of the software, including how drivers interact with it, and whether a defect exists that requires corrective action. This marks the latest in a series of regulatory actions targeting Tesla’s driver assistance systems. In February 2024, NHTSA opened a separate investigation into Tesla’s remote parking feature, known as “Smart Summon,” following complaints about unexpected vehicle movements. In late 2023, the agency launched an inquiry into Tesla’s Autopilot and FSD systems after a series of high-profile crashes in foggy and low-light conditions, including one fatal incident.
NHTSA has the authority to request a recall if it determines a safety defect exists. In December 2023, Tesla issued an over-the-air update to address concerns raised in an earlier NHTSA probe into Autopilot-related crashes. The announcement had an immediate impact on financial markets: Tesla shares fell nearly 2 percent in intraday trading Thursday following disclosure of the investigation. The company, which has promoted its FSD technology as central to its future product offerings, had not responded to requests for comment from regulators or media as of publication.
Timeline for preliminary investigation remains unspecified
Tesla’s driver-assistance systems have faced increasing scrutiny from federal regulators and safety advocates. Although the company states that FSD does not make vehicles fully autonomous, critics have argued that the branding and user interface could encourage driver over-reliance on the system. The software requires drivers to remain attentive and keep their hands on the wheel at all times. The current investigation remains in its early stages, and NHTSA has not set a timeline for the preliminary evaluation. Depending on its findings, the agency may escalate the probe into an engineering analysis, the next step before any recall decision is made.
Tesla has previously stated that safety is a core focus of its driver assistance technologies and that its systems are designed to reduce the likelihood of accidents. The company continues to roll out updates to FSD software via over-the-air downloads to vehicle owners. Tesla maintains that driver supervision is required at all times when using FSD and emphasizes that the system does not make vehicles autonomous. The company asserts that ongoing software improvements are aimed at enhancing performance and compliance with road safety standards. – By Content Syndication Services.
