Uber’s self-driving cars had a major flaw: They weren’t programmed to stop for jaywalkers


“Even the most junior human driver knows to expect that people sometimes walk outside of crosswalks,” said Jason Levine, executive director of the Center for Auto Safety, a Washington, D.C.-based nonprofit organization. “When we have public road testing that has programming that essentially chooses to ignore the realities of how people interact with public infrastructure, it’s really dangerous.”

The bigger question for Levine: Could similar serious oversights plague other models across the self-driving industry?

“The answer is, we don’t know, because there’s no requirements around how you program this technology before you put it on the road,” he said.

Levine said he wants to see federal regulation.

Software detected the woman, Elaine Herzberg, almost six seconds before Uber’s self-driving car struck her, investigators say, in the crash that killed her and prompted the ride-share giant to slam the brakes on its autonomous vehicle testing.

But the SUV didn’t begin braking until about a second before impact. One big reason: the system wasn’t designed to recognize a pedestrian outside of a crosswalk, according to documents released this week by the National Transportation Safety Board after a 20-month investigation.
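To make the reported failure mode concrete, here is a minimal, hypothetical sketch in Python. Every name in it (`Detection`, `should_brake`) and the specific timing values beyond the roughly six-second detection window are invented for illustration; this is not Uber’s software, only a picture of how a braking rule that recognizes pedestrians solely inside crosswalks can watch an object approach for seconds without ever deciding to stop.

```python
# Hypothetical sketch (not Uber's actual code): a braking rule that only
# treats crosswalk pedestrians as stop-worthy detects an object early yet
# never converts the detection into a braking decision.

from dataclasses import dataclass


@dataclass
class Detection:
    kind: str              # e.g. "pedestrian", "vehicle", "unknown"
    in_crosswalk: bool     # whether the detection lies inside a mapped crosswalk
    seconds_to_impact: float


def should_brake(d: Detection) -> bool:
    # Flawed rule: a pedestrian triggers braking only inside a crosswalk,
    # so a jaywalker is treated like any other unclassified object.
    return d.kind == "pedestrian" and d.in_crosswalk


# The same person, tracked as the car approaches:
track = [
    Detection("unknown", False, 5.6),      # first detected nearly 6 s out
    Detection("pedestrian", False, 2.6),   # recognized, but outside a crosswalk
    Detection("pedestrian", False, 1.2),   # still ignored by the braking rule
]

for d in track:
    print(f"{d.seconds_to_impact:>4.1f}s out: brake={should_brake(d)}")

# Every check returns False: under this rule, seeing the object never leads
# to a stop. That is the kind of gap the NTSB documents describe between
# detection (~6 s before impact) and braking (~1 s before impact).
```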

Revelations that Uber failed to account for jaywalkers, with deadly results in Tempe, Ariz., in March 2018, fuel long-standing objections from critics who accuse companies such as Uber of rushing to deploy vehicles not ready for public streets. They remain skeptical that automakers eager to lead on industry-transforming technology are doing enough to avoid another tragedy as they continue to test cars in the real world.

Read the full story from the Washington Post.