After incidents in multiple cities in which self-driving taxis failed to stop for school buses, the NTSB joins NHTSA in a probe to determine what is behind the failures and the related safety concerns.
Recent federal actions involving Tesla and Waymo are intensifying national debate over the safety, oversight, and transparency of autonomous and semi-autonomous vehicles, especially around school buses.
The Witherite Law Group and others have raised concerns about new federal regulatory rollbacks that weaken transparency in autonomous vehicle safety reporting and their potentially negative effect on schoolchildren.
The U.S. Department of Transportation recently announced changes to self-driving crash-reporting rules for "Level 2" driver-assist systems like Tesla's Autopilot, removing the requirement to report incidents involving no injuries or fatalities.
Critics warned in an October release that this reduction in transparency could make it more difficult to detect trends in system malfunctions and could obscure the true frequency of crashes. The changes stand to benefit Tesla, which accounted for more than 800 of the 1,040 self-driving crashes reported last year.
Transportation Secretary Sean Duffy defended the decision as an efficiency measure to boost competitiveness, but safety advocates argue it compromises public safety and accountability.
Background: Waymo Faces Investigation Over Multiple School Bus Passings
Last fall, the National Highway Traffic Safety Administration (NHTSA) opened a preliminary investigation into approximately 2,000 Waymo driverless vehicles after reports that one failed to stop for a school bus displaying flashing lights and an extended stop arm.
The vehicle, which was operating without a human driver, reportedly maneuvered around the front of the bus as children exited.
In October, Waymo stated it has since improved its systems, emphasizing that "driving safely around children has always been one of Waymo's highest priorities."
On Dec. 5, Waymo said it would issue a recall for its self-driving vehicles after Texas officials said they illegally passed school buses at least 19 times since the start of the school year.
The NHTSA website includes a letter from the Austin Independent School District citing incidents of Waymo vehicles "illegally and dangerously" passing the district's school buses. The letter states, "Five of these violations occurred after Waymo's Nov. 5, 2025, letter reassuring Austin ISD that certain software updates were in place to resolve the issue."
An incident report from the district details a Waymo automated vehicle driving past a stopped school bus "moments after" a student crossed in front of the vehicle, while the student was still in the road.
In the letter, Austin ISD demanded that Waymo immediately cease operations of its automated vehicles during pickup and drop-off hours until more in-depth software updates are completed.
Austin ISD officials say the violations continued even after Waymo deployed its software update, and operations have not been paused despite the concerns.
ACT News reported similar incidents in Atlanta, where an 11Alive investigation found at least six cases of Waymo vehicles passing stopped Atlanta Public Schools buses.
NTSB Launches New Waymo Probe
Late last week, the National Transportation Safety Board (NTSB) also opened an investigation into Waymo based on the incidents in Austin.
“Investigators will travel to Austin to gather information on a series of incidents in which the automated vehicles failed to stop for loading or unloading students,” the NTSB said to TechCrunch.
A preliminary report is expected within 30 days, with a more detailed report in one to two years.
Teamsters React
In a recent press release, the Teamsters labor union spoke out about the Waymo investigation.
"This incident is emblematic of the broader goal Big Tech companies have to replace skilled human labor with AI," wrote Peter Finn and Victor Mineros, co-chairs of Teamsters California. "They want to force millions of people into destitution by destroying their livelihoods, seize money that belongs to workers, and force our communities to reckon with the fallout of automation's shortcomings. Robotaxis threaten workers' jobs and are now terrorizing our kids.
"Waymo vehicles have continued to illegally ignore school bus stop signs despite a company-wide software recall and another, separate NHTSA investigation. Parents, teachers, school workers, and community members have been demanding that these vehicles be kept away from school zones. Waymo and its parent company, Google, choose to ignore those warnings."
In response, Teamsters are calling on the California Public Utilities Commission (CPUC) to indefinitely suspend Waymo's license to operate in California.
Attorney Warns Autonomous Vehicles Are "Out of Control"
These incidents show that the rush to deploy autonomous and semi-autonomous vehicles continues to outpace safety and accountability standards, said Amy Witherite, founder of Witherite Law Group, last October. "Transparency, rigorous testing, and consistent enforcement must keep pace with innovation — or we risk replacing one set of dangers with another."
In December, Witherite issued an urgent warning that autonomous vehicle companies are “out of control” following Waymo’s reported refusal to curtail service in Austin even after video evidence was shared.
On Dec. 5, CBS Austin reported that Travis Pickford, Austin ISD’s assistant chief of police, said Waymo "disagreed with our risk assessment and have refused to cease operations."
“If a human driver broke the law 20 times around school buses, their license would be suspended immediately,” Witherite said. “Waymo’s response was simply: ‘We’re not stopping.’ That is not accountability. That is a corporation telling a school district to live with the danger.”
According to Witherite, the repeated school-bus violations in Austin are part of a broader pattern of autonomous system failures documented nationwide, which the firm lists as:
- A Waymo robotaxi driving into an active police stop in Los Angeles, coming within feet of a prone suspect as officers shouted commands at the vehicle.
- Erratic driving behavior reported in multiple cities, including illegal turns, rolling stops, and confusion at intersections.
- Collisions involving pets and pedestrians, including multiple incidents in San Francisco and Phoenix.
- Documented struggles navigating school zones, emergency scenes, and unpredictable pedestrian behavior, situations where safety margins must be highest.
“Frankly, this is the wild west when it comes to autonomous vehicles,” said Captain Matt McElearney of the Austin Fire Department’s AV Safety Working Group. “There is a lot of leeway that they have, and enforcement is very limited on our end.”
Waymo's Response
Despite its software recall, Waymo stands by its vehicle technology and has not scaled back service in Austin or other cities.
In a statement to NPR, Waymo Chief Safety Officer Mauricio Peña said that while the company is proud of its safety record, "holding the highest safety standards means recognizing when our behavior should be better."
Just before Christmas, Waymo's San Francisco fleet appeared to glitch during a citywide power outage, with cars stopping in the middle of roads and snarling traffic. Waymo briefly suspended service in response.
The company maintains that it is "confident that our safety performance around school buses is superior to human drivers."
Waymo has recently launched operations in additional U.S. cities, including Miami, Pittsburgh, and St. Louis.
Editor's Note: This article was originally published on October 22, 2025, and was updated on December 9 and 11, 2025, to include updates on autonomous vehicles and illegal school bus passing. It was again updated on January 26, 2026, when NTSB announced its investigation. Follow along for more updates.