A Waymo Driverless Car Hit A Child, And Now The NHTSA Is Involved

Parents teach their children to look both ways before crossing the street because drivers may not see them. Now it seems that lesson needs expanding: driverless cars may not see them either, as a Waymo robotaxi has struck a child.

The incident happened on January 23, 2026. A child near an elementary school ran past a double-parked SUV and into the street, directly in front of the driverless Waymo car, and was struck. The good news is that the child is reportedly alright, with only minor injuries. Waymo itself reported the incident to the National Highway Traffic Safety Administration (NHTSA), which has since opened an investigation. The NHTSA operates across the U.S. through 10 regional offices, working to ensure road safety through laws, regulations, and collaboration with car manufacturers.

Waymo has made a lot of waves recently as an innovative driverless robotaxi service, expanding to major cities across the U.S. This incident took place in Santa Monica, California, where Waymo has been operating since October 2023, and it refreshes the debate over whether driverless cars are safe, and whether a human driver would have also hit the child in the same situation.

Details of the Waymo incident

The incident happened within two blocks of an elementary school, according to a report by the NHTSA. There was a crosswalk in the area with a crossing guard and other children headed to and from the school. The child who was hit, however, was not using the crosswalk and was running across the street alone to reach the school. To make the situation even more dangerous, the child was crossing behind a double-parked SUV that was blocking visibility.

This Waymo robotaxi was driving without any human oversight inside the vehicle, using Waymo's 5th Generation Automated Driving System, which the company has deployed since 2020. The system gives the car full 360-degree visual coverage, with radar on the front bumper, the hood, the back bumper, and near the taillights, plus two different camera systems on top. Waymo argues its system can make streets safer by avoiding incidents caused by reckless human driving.

The NHTSA's Office of Defects Investigation (ODI) is still investigating the incident as of this writing. The ODI is looking into several factors that will determine whether it believes Waymo is at fault, including whether the Waymo car was adhering to all laws and posted speed limits, and whether it was exercising the caution expected in a school zone.

Is Waymo to blame or not?

Incidents like this reopen the conversation about whether self-driving cars with no human oversight are actually safe in unpredictable areas, such as around a school. Children can behave erratically near roads, so drivers are expected to be extra cautious. During the ODI investigation, one question will be whether a human driver would have behaved the same way the Waymo vehicle did, or would have been more cautious. A human driver would know to watch for children behaving unexpectedly near streets, and would recognize that a double-parked SUV presents an even greater visibility challenge, exercising appropriate caution accordingly.

However, is Waymo truly at fault? The child did run out into the street without using the crosswalk. There's also the issue of the double-parked SUV. Double parking is illegal in California; it obstructs the road and blocks the visual range of other drivers and pedestrians. It raises the interesting question of whether the Waymo car would have noticed the child if that SUV had not been double-parked.

So far there's no information on who the driver of that SUV was or why they were parked that way. This brings to mind the time a Waymo vehicle was seen illegally parked in a bike lane; in that case, it was actually a human driver in manual mode who did the parking. It will be interesting to see what the NHTSA ultimately determines.
