A self-driving car operated by Uber struck and killed a woman crossing a street in Tempe, Arizona, on Sunday, March 18, raising safety concerns about just how ready these vehicles are to be driving on roads in our communities.
Although a human safety driver was sitting behind the wheel, the vehicle was in autonomous mode when it struck the woman, who was crossing the street outside a crosswalk at about 10 p.m., according to the New York Times.
Uber has been testing its self-driving cars in Tempe, in addition to Pittsburgh, San Francisco, and Toronto. The company has suspended its tests in those cities in response to the incident, according to the Times. The company says it’s cooperating with the investigations of authorities, including Tempe police and the National Transportation Safety Board, and that “our hearts go out to the victim’s family.”
Autonomous cars are likely here to stay and will only grow more ubiquitous as companies like Uber, Google-affiliated Waymo, and Tesla expand their driverless experiments on the road to a full roll-out. Accidents causing injuries or fatalities are likely to happen again, as the cars' computers treat our communities as training grounds.
“We are more or less in their training ground,” attorney Mike Morgan says. “These machines are not able to anticipate what they don’t know. They are learning on the job. Until they experience a person crossing a street or darting out in front of them, they’re not trained yet on how to react.”
Contrast that with human drivers, who collectively have encountered unpredictable circumstances on the road millions, if not billions, of times before, Mike says.
As self-driving cars are deployed in more communities to gather more data to help them improve their driving ability, federal and state lawmakers will need to address how to properly regulate them.
California, already prime territory for self-driving car experiments, plans to start allowing companies to test autonomous vehicles without backup drivers behind the wheel in April, something Arizona already allows, according to the Times. It’s that relatively laissez-faire tack which has made Arizona attractive territory for companies looking to experiment with driverless vehicles.
On that note, this isn’t the first time that one of Uber’s test cars has been involved in a crash in Tempe. About a year ago, a self-driving Uber car was knocked on its side after another vehicle hit it. There were no injuries in that crash.
Other car companies have experienced issues with their autonomous or semi-autonomous vehicles. Tesla's Model S has an autopilot mode that is supposed to be an assisting feature: the driver is still supposed to keep their hands on the wheel at all times, and if they don't, the car sets off warnings. However, this putative "super cruise control" has already led to two fatalities in cases where the feature was relied on as a driverless solution. (Tesla announced in February that it plans to do a coast-to-coast, fully autonomous test drive within three to six months, according to a shareholder letter from the company.)
The idea behind self-driving cars is that they will ultimately make driving safer, but it’s clear that they are still a huge safety risk. California has approved 111 car models for testing already, and a BI Intelligence report predicts there could be as many as 10 million self-driving cars on the road by 2020.
Google and Uber have already indicated that they plan to deploy vehicles that lack traditional controls like a steering wheel or pedals, removing the ability of passengers or a backup driver to intervene. Although we don't yet know the facts of the Arizona fatality, it is notable that even a human safety driver seemingly couldn't prevent Sunday's fatal Uber crash in Tempe.
Overall, it will come down to this: carmakers, tech giants, and lawmakers will need to agree on sensible regulations that place the safety of humans above all else. Until we get there, safety risks will remain high as companies test their autonomous vehicle technology.
“This is an anticipatable problem,” Mike says. “We know we’re going to encounter it more and not less as cars start to drive themselves.”