Federal regulators are investigating the safety of self-driving vehicles operated by Waymo, the autonomous-vehicle subsidiary of Alphabet Inc., after one of the company's vehicles struck a child near an elementary school.
The U.S. National Highway Traffic Safety Administration said in documents posted to its website on Jan. 29 that its Office of Defects Investigation (ODI) has opened a preliminary investigation into Waymo's self-driving cars. The company reported that one of its vehicles struck a child in a Jan. 23 crash that "occurred within two blocks of a Santa Monica, California, elementary school during normal school drop off hours."
NHTSA said there were other children, a crossing guard and several double-parked vehicles in the vicinity of the crash. The child who was struck by the Waymo self-driving car ran across the street from behind a double-parked SUV toward the school. Waymo reported that the child sustained minor injuries, according to NHTSA. The agency also said no safety operator was present in the vehicle.
NHTSA said its defect office opened the new probe “to investigate whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”
The ODI office said it “expects that its investigation will examine the (Automated Driving System)’s intended behavior in school zones and neighboring areas, especially during normal school pick up/drop off times, including but not limited to its adherence to posted speed limits.” ODI said it will also investigate Waymo’s response after the incident.
Waymo said in a statement it is “committed to improving road safety, both for our riders and all those with whom we share the road.” The company said it voluntarily contacted the National Highway Traffic Safety Administration the same day as the crash and plans to cooperate with the federal probe.
The National Highway Traffic Safety Administration classifies automated driving on a six-level scale. Waymo's vehicles are categorized as Level 4, which are considered highly autonomous. The agency says of Level 4 systems: "When engaged, the system handles all driving tasks while you, now the passenger, are not needed to maneuver the vehicle. The system can only operate the vehicle in limited service areas, not universally. A human driver is not needed to operate the vehicle."
NHTSA is careful to note that cars with Level 4 technology are not available for consumers to purchase, but they are used by ride-hailing services in several U.S. cities.
The agency previously opened a preliminary evaluation of Waymo’s self-driving cars in May 2024 after receiving reports that the company’s vehicles crashed into objects, including gates, chains and parked cars. The agency also cited instances in which the company’s automated-driving system appeared to disobey traffic-control devices. The investigation was later closed by regulators in July 2025.
Michael Brooks, executive director of the Washington-based Center for Auto Safety, which advocates for stringent car regulations, said “Waymo has also struggled to correct recent safety failures involving school buses, and continues to ignore local requests to cease operations in the presence of school buses.”
“A child running into the path of a vehicle presents an incredibly dangerous scenario requiring an immediate response from approaching drivers, whether human or computer,” Brooks said.
Brooks said he thinks it is important for NHTSA to look into the incident to "determine if there were cues potentially missed by the Waymo vehicle that a human driver would have used to avoid a collision."
“If AV companies continue to resist state and local governance, additional federal enforcement and oversight is going to be necessary to promote safe autonomous operations,” he said.