Waymo Robotaxis Are a Driverless Danger on the Road: What You Should Know About the Recent Recall


Key Takeaways

  • Waymo recalled nearly 3,800 robotaxis over a dangerous software issue. The vehicles may fail to properly detect hazards like flooded roads.
  • Incidents like this suggest companies may be putting experimental self-driving systems on the road before they are fully safe.
  • Victims of autonomous vehicle incidents may have legal options. Injuries involving robotaxis could involve complex liability across manufacturers, software developers, and operators.
  • Morgan & Morgan can help hold companies accountable. If you were injured by a self-driving vehicle, contact Morgan & Morgan for a free case evaluation to explore your legal options.


As companies race to put driverless technology on public roads, incidents involving robotaxis are raising serious concerns about whether the technology is truly ready for real-world conditions.

Now, Waymo is recalling nearly 3,800 robotaxis after federal regulators identified a dangerous software issue that could cause the vehicles to drive into flooded roads. According to reports, the recall affects certain fifth- and sixth-generation automated driving systems currently operating in cities across the United States.

For many consumers, the recall reinforces a growing fear: Are companies putting experimental technology on public streets before it’s safe enough to protect the people around it?

What Happened in the Waymo Recall?

The recall stems from an April 2026 incident in San Antonio, Texas, where an unoccupied Waymo vehicle reportedly drove into a flooded roadway during severe weather. Federal safety regulators said the software may fail to properly avoid flooded roads, even when those conditions create serious hazards.

The recall impacts approximately 3,791 vehicles equipped with Waymo’s automated driving systems. As an interim measure, regulators said Waymo has updated its maps and increased weather-related operating restrictions while working on a more permanent software fix.

Thankfully, no injuries were reported in the Texas flooding incident. But critics argue that the lack of injuries does not excuse the larger issue: a self-driving vehicle should never have entered a dangerous flood zone in the first place.

A Growing Pattern of Safety Concerns

The flooded-road recall is not happening in isolation. Waymo has faced mounting scrutiny over other incidents involving its autonomous vehicles.

Federal investigators are currently reviewing incidents involving:

  • A Waymo vehicle striking a child near a California school
  • Robotaxis allegedly passing stopped school buses with flashing lights
  • Vehicles becoming stranded or behaving unpredictably in public traffic situations
  • Concerns about autonomous systems operating during dangerous weather conditions

As self-driving fleets continue expanding into more cities, many safety advocates question whether companies are using public streets as testing grounds for unfinished technology.

The issue becomes even more concerning when you consider the complexity of real-world driving. Flooding, emergency vehicles, school zones, construction, pedestrians, cyclists, and unpredictable weather are all situations that require split-second judgment. When autonomous systems fail, innocent people can get hurt.

Were These Vehicles Put on the Road Too Soon?

Tech companies often frame autonomous driving as innovation, but innovation does not excuse negligence.

When companies deploy thousands of driverless vehicles on public roads, they take on a serious responsibility to ensure those systems can safely respond to dangerous conditions. If the technology cannot consistently identify hazards like floodwaters, many people understandably question whether these vehicles belong on public roads at all.

Consumers did not volunteer to participate in a large-scale real-world experiment involving autonomous vehicles. Yet pedestrians, cyclists, passengers, and other drivers may all be exposed to risks created by systems they never agreed to trust.

And while companies often promote statistics about autonomous driving safety, recalls and investigations continue to expose situations where these systems allegedly failed to behave safely in critical moments.

Who Could Be Harmed by a Self-Driving Vehicle?

Autonomous vehicle accidents can affect far more people than just passengers inside the car. Potential victims may include:

  • Drivers of other vehicles
  • Pedestrians
  • Cyclists
  • Children near schools or intersections
  • Motorcyclists
  • Passengers inside robotaxis
  • Individuals injured during weather-related incidents

In some cases, victims may suffer severe injuries, including:

  • Broken bones
  • Head trauma
  • Spinal cord injuries
  • Internal injuries
  • Emotional distress
  • Long-term disability

When a crash involves autonomous technology, determining liability can become extremely complicated. Multiple parties may share responsibility, including the autonomous vehicle company, software developers, manufacturers, maintenance providers, or third-party contractors.

That complexity is one reason why victims should not attempt to navigate these cases alone.

Can Victims Take Legal Action Against Autonomous Vehicle Companies?

Potentially, yes. If a self-driving vehicle causes an accident or injury, victims may have legal grounds to pursue compensation. Depending on the facts of the case, claims may involve allegations such as:

  • Negligent software design
  • Failure to properly test autonomous systems
  • Defective vehicle technology
  • Failure to warn consumers or the public
  • Unsafe deployment of autonomous fleets
  • Negligent operation of self-driving vehicles

Compensation in these cases may include damages for:

  • Medical expenses
  • Lost income
  • Pain and suffering
  • Rehabilitation costs
  • Future medical care
  • Emotional distress
  • Wrongful death damages in fatal cases

As autonomous vehicle technology evolves, courts and regulators are still grappling with how liability should be handled. But one thing remains clear: companies cannot simply avoid responsibility because a computer was driving the car.

Why These Cases Require a Powerful Law Firm

Autonomous vehicle litigation is likely to become one of the most complex areas of personal injury law in the coming years.

These cases may involve massive corporations, sophisticated software systems, technical engineering evidence, electronic driving data, and teams of defense attorneys working to limit liability. Victims can quickly find themselves overwhelmed while trying to recover from serious injuries.

That is why having experienced legal representation matters.

As the largest personal injury law firm in America, Morgan & Morgan has the resources to stand up to some of the biggest corporations in the world. We have recovered billions for clients and have experience handling complex injury litigation involving large companies and emerging technologies.

When corporations allegedly put unsafe products or systems into the marketplace, injured victims deserve answers, accountability, and the opportunity to seek compensation.

Contact Morgan & Morgan

Self-driving technology may represent the future, but companies still have a responsibility to protect the public today.

If you or someone you love was injured in an accident involving a robotaxi or autonomous vehicle, you may have legal options. An attorney can help investigate what happened, determine who may be responsible, and explain whether you may be entitled to compensation.

Morgan & Morgan fights For the People, not powerful corporations experimenting with public safety. 

Hiring one of our lawyers is easy, and you can get started in minutes with a free case evaluation.

Disclaimer
This website is for general informational purposes only and does not constitute legal advice.