An investigation opened by the National Highway Traffic Safety Administration (NHTSA) alleges traffic safety violations committed by Waymo driverless taxis around a stopped school bus, specifically “when the bus [was] boarding or offboarding students.” That is, of course, an especially dangerous moment for a traffic infraction to occur. The incidents are, according to the report, spread across a fleet of about 2,000 Waymo driverless taxis operating in the Atlanta, Georgia area.
Waymo Taxis Struggle To Follow Some Traffic Laws

Waymo’s taxis in particular get a lot of publicity for this problem, but other autonomous rideshare vehicles encounter similar issues. Often, the vehicles are not willfully disobeying traffic laws – it’s a result of their programming. However, legal authorities have frequently struggled to hold anyone responsible. How can an officer issue a ticket to a Waymo taxi making an illegal U-turn when there is no driver? In California, Waymo will be responsible for paying infractions starting next year, but the legal framework for these companies is still fractured.
Waymos Aren’t Stopping For School Buses

Enter: the NHTSA’s new investigation. The Office of Defects Investigation probe follows a media report alleging a Waymo taxi failed to remain stopped when it came across a parked school bus in Atlanta on September 22, 2025. The taxi initially stopped beside the bus before driving around the front of the bus and back along its opposite side. Students were exiting the bus at the time, and the bus had its stop sign deployed and its lights flashing. Across the US, it is illegal to pass a stopped school bus in that state because of the high risk of a collision with embarking or disembarking kids.
Waymo told Car and Driver that the company had updated the taxi’s software to improve its ability to recognize a stopped school bus. The spokesperson defended the car’s actions, saying the bus was blocking a driveway the taxi was attempting to exit, and that the bus’s lights and stop sign were not visible to the car during its maneuver.
The incident illustrates an issue that’s become commonplace with autonomous vehicles. “Self-driving” taxis like Waymo’s Jaguar I-Pace crossovers are developed in the real world, in real time. In other words, a bandage is applied only after a wound already exists. Companies cannot possibly program a vehicle for every contingency, so an ex post facto remedy is often the best they can offer in instances like this one. In some cases, the penalty for this way of doing business may one day be injury or worse. As a result, Waymo vehicles in particular have been the target of public intervention and outcry.