Waymo’s self-driving cars are under federal investigation after getting caught blowing past stopped school buses in Atlanta and Austin, flashing red lights and extended stop arms be damned. The National Highway Traffic Safety Administration is digging into whether the robo-cars ignored basic safety rules while buses were loading kids, a scenario where even distracted humans know to slam the brakes.
Atlanta schools flagged six bus-stop violations by Waymo’s autonomous fleet, including one nightmare scenario last September. Documents show one of the driverless cars rolled up on a stopped bus, paused like it was thinking things over, then just… cruised right past. No backup human, no second-guessing, just a cold-blooded bypass while kids were in the mix.
Things got messier in Austin, where cameras caught more of the same behavior. Now the feds are asking the ugly question: does Waymo’s tech actually know how to handle stopped school buses, or is it winging it?
The company admits its software screwed up and says a fix is coming via recall. But let’s be real, this isn’t some minor glitch. We’re talking about an AI that failed what should be a no-brainer test: don’t blow past a stopped school bus.
The NHTSA inquiry is just the latest headache for an autonomous vehicle industry already under fire over how self-driving systems handle high-risk road scenarios. Regulators will comb through technical data, assess how the vehicles interpret school bus signals, and decide whether more corrective action is needed as the investigation plays out.
