
U.S. Closes Tesla Probe on Remote Driving Feature

U.S. regulators have ended an investigation into Tesla’s remote driving feature after determining it posed limited safety risk following software improvements.

The National Highway Traffic Safety Administration (NHTSA) reviewed the company’s “Actually Smart Summon” system, which allows users to move vehicles short distances via a smartphone, typically in parking environments. The probe covered approximately 2.6 million vehicles.

Authorities identified around 100 reported incidents linked to the feature. These cases largely involved low-speed collisions with stationary objects such as parked cars, garage doors or gates. No injuries, fatalities, airbag deployments or major crashes were recorded.

Regulators concluded that the frequency and severity of these incidents did not justify further enforcement action. Tesla had already deployed software updates to address identified issues, including enhancements to obstacle detection, environmental awareness and system response to dynamic conditions.

The updates also addressed limitations caused by camera obstructions such as snow or condensation, which had contributed to errors during early feature activation.

Despite the closure of this probe, Tesla’s broader autonomous driving systems remain under scrutiny. The NHTSA recently escalated its investigation into the company’s Full Self-Driving (FSD) technology to an engineering analysis stage, covering more than 3 million vehicles and examining reports of traffic violations and crashes.

The decision underscores a regulatory approach that differentiates between low-risk driver-assistance features and more complex autonomous systems, which continue to face heightened oversight.

U.S. Safety Regulators Probe Waymo Robotaxis Over School Bus Incident

U.S. auto safety regulators have opened a preliminary investigation into Waymo, Alphabet’s self-driving car unit, after reports that one of its robotaxis failed to stop properly for a school bus in Georgia. The probe, launched by the National Highway Traffic Safety Administration (NHTSA), covers about 2,000 vehicles equipped with Waymo’s fifth-generation Automated Driving System.

The investigation follows a media report showing a Waymo vehicle maneuvering around a stopped school bus with its red lights flashing and stop arm extended while children were disembarking — a clear violation of school bus safety protocols. NHTSA said the vehicle initially stopped before moving around the bus, suggesting a potential software or perception failure.

Regulators noted that given Waymo’s extensive operations — the company’s autonomous cars have logged over 100 million miles and currently drive 2 million miles per week — similar incidents could have occurred previously. The agency emphasized the need to evaluate how Waymo’s technology responds to critical real-world safety cues, particularly around children and pedestrians.

Waymo acknowledged the event, saying it has already implemented software improvements to enhance behavior around school buses and will issue further updates soon. “Driving safely around children has always been one of our highest priorities,” a company spokesperson said, explaining that the vehicle’s sensors may not have initially detected the flashing signals due to its angle of approach.

The company operates a fleet of over 1,500 driverless vehicles in Phoenix, San Francisco, Los Angeles, and Austin. The new probe comes months after NHTSA closed another 14-month investigation into Waymo’s earlier collisions with stationary objects, which led to two vehicle recalls.

U.S. Investigates 2.9 Million Teslas Over Full Self-Driving Traffic Violations

The U.S. National Highway Traffic Safety Administration (NHTSA) has launched an investigation into 2.88 million Tesla vehicles equipped with the company’s Full Self-Driving (FSD) software after receiving more than 50 reports of traffic violations and crashes linked to the system.

The agency said the FSD feature — which requires driver attention and intervention — has in some cases “induced vehicle behavior that violated traffic safety laws,” including driving through red lights and making illegal lane changes. So far, 58 incidents have been reported, including 14 crashes and 23 injuries, according to NHTSA.

In at least six cases, Teslas running FSD reportedly entered intersections against red signals, leading to collisions, four of which caused injuries. The regulator said it is also examining FSD’s behavior at railroad crossings following concerns raised by U.S. lawmakers over near-miss incidents.

The probe is a preliminary evaluation, the first stage of a process that could lead to a vehicle recall if safety risks are confirmed. Tesla shares slipped 2.1% following news of the investigation, first reported by Reuters.

Tesla recently issued a software update for FSD, though the company has not publicly commented on the probe. The system has been under continuous federal scrutiny amid concerns that its branding and performance blur the line between driver assistance and full automation.

Experts say the U.S. action may pressure other regulators to examine the growing use of semi-autonomous technologies in vehicles worldwide.