Report: Self-driving Uber could not respond to jaywalking

Nov 10, 2019 | Ride Share/Taxi/Transit Injuries

Before any self-driving vehicle is allowed on our public roadways, it should be fully operational. That includes being able to avoid hitting other vehicles, motorcyclists, bicyclists and pedestrians. Even one incident of an autonomous vehicle failing to notice a person in its path is too many.

In March 2018, an Uber self-driving vehicle undergoing testing on the streets of Tempe, Arizona, struck and killed a 49-year-old pedestrian. When the National Transportation Safety Board (NTSB) investigated, it found two important problems with the vehicle, which it announced this month.

First, Uber had disabled the vehicle's automatic emergency braking and was relying on the human safety driver to respond in an emergency. Second, and more shocking, the system had not been designed with real-world conditions in mind: its software made no allowance for pedestrians who jaywalk.

“Clearly there was a technological failure in the sense that this was not a mature system,” a law professor who studies autonomous vehicles told the Associated Press. “The response to ‘I don’t know what is in front of me’ should absolutely be slow down rather than do nothing.”

No federal regulations for testing autonomous vehicles

The NTSB’s investigation provides a crucial view into the cause of this crash, but the agency has no regulatory authority. It can only make recommendations for how to avoid future incidents.

For those recommendations to carry the force of law, either the National Highway Traffic Safety Administration (NHTSA) or state or federal lawmakers would have to take action.

There are currently no federal laws or regulations in place for testing autonomous vehicles on our nation's roadways. According to the AP, NHTSA is taking a hands-off approach in order to avoid slowing the technology's development. Ultimately, autonomous vehicles are expected to be safer than those driven by humans, assuming the companies can overcome problems like the ones that led to this fatal crash.

But a senior policy analyst for Consumer Reports said the NTSB’s report reveals “outrageous safety lapses” by Uber and called for enforceable regulations.

“We hope Uber has cleaned up its act, but without mandatory standards for self-driving cars, there will always be companies out there that skimp on safety,” he said.

When a self-driving vehicle injures someone during a test, the victim has rights under the law of product liability. Companies can be held strictly liable for the harm their products cause, no matter how promising the technology behind them may be.