Don’t Let the First Pedestrian Death by Uber's Self-Driving Car Freak You Out

A pedestrian in Arizona was killed by a self-driving Uber.

Mar 20 2018, 1:45pm

An autonomous Uber with a backup human driver in Pittsburgh. Image: Shutterstock

Nearly 6,000 pedestrians were hit and killed by cars in the US last year—an increase of nine percent from the previous year, and the highest number since 1990, according to the National Highway Traffic Safety Administration. Pedestrians aren’t alone; drivers, motorcyclists, and cyclists are all being killed with greater frequency on American roads. In 2016, 37,461 people died in traffic accidents in the US.

Meanwhile, a pedestrian in Arizona was killed by an Uber-owned vehicle in autonomous mode over the weekend, and we're willing to burn the whole driverless-car thing down. Uber has since pressed pause on its testing efforts in San Francisco, Arizona, Pittsburgh, and Toronto. Special interest groups such as the Teamsters union—which represents professional drivers in North America and which has previously lobbied the government against self-driving vehicles—are using the death of the 49-year-old pedestrian, Elaine Herzberg, to cast a shadow on the technology.

It is tragic that yet another person has died in a senseless road accident, and there are plenty of reasons to be cautious about autonomous cars and their makers. However, as a wise man once said: Check yourself before you wreck yourself.

I'm not a believer in the absolute superiority of robots, but when it comes to cars, there's an unavoidable truth: Humans are generally terrible drivers. Robots would probably be an improvement. That’s especially true when these robots get connected to each other and to the environments around them—e.g. traffic lights, road sensors, intersections, maybe even pedestrians’ smartphones—to create a level of operational safety we’ve never known on our roads. (We’ll likely grapple with cybersecurity and privacy for years to come.)

We fear autonomous vehicles (AVs) because they’re unfamiliar, and every misstep is seen as proof the robots are going to kill us all. I mean, they might, but it’s extremely unlikely to happen in our lifetimes.

They won’t even really need to kill us, because we’re so good at doing it ourselves. I have a Google alert to give me a daily digest of news stories containing the word “pedestrian,” and every single day I get at least one notification about another pedestrian in the US getting killed by a car. The World Health Organization says 270,000 pedestrians, and 1.25 million people overall, are killed each year on roads around the world.

Given these numbers, there is a certain flawed logic behind calls to regulate the bejeezus out of autonomous cars when we’ve done so little to make roads safer for everyone.


That isn’t to say AVs shouldn’t be regulated; they should. But the government loves to move slowly, and the AV START and SELF-DRIVE bills that seek to federally regulate the AV industry in the US are both stalled.

“Congress must move quickly to update federal safety rules for self-driving vehicles and ensure NHTSA has the tools and resources it needs to strengthen regulatory oversight of automated vehicle technology,” Sen. Gary Peters, one of the architects of the AV START bill, said in a statement to Motherboard.

These two bills aren’t perfect, and make a lot of concessions to automakers, but they’re a start. Advocates for Highway and Auto Safety, a multidisciplinary alliance that has been outspoken on AV regulation, wrote a letter on Monday urging US legislators to sharpen their oversight of auto manufacturers, noting “the [AV START] bill still lacks essential safeguards that will assure sufficient government oversight.”

A Waymo self-driving car stopped at a traffic light in Mountain View, Calif. Image: Shutterstock

Until a bill is passed that creates strict rules about AV testing and deployment, every delay affords automakers the freedom to make up the rules as they go along.

Still, the absence of AV-specific rules does not mean there are no rules at all; there are still many vehicle safety rules automakers have to comply with, autonomous or not.

If only for capitalism’s sake, it’s in their best interest to deploy the most mechanically sound cars possible—to both avoid potentially company-ending liability lawsuits, and to finally recoup some of the $80 billion and counting of R&D they’ve poured into AVs.

It’s worth mentioning that Uber’s self-driving cars have driven more than two million miles. Waymo has driven five million miles autonomously. Back in 2016, Tesla claimed its cars had collectively travelled more than 222 million miles on autopilot. Other automakers and tech companies have likely driven millions of miles more. And after all that time on the road, we can count the number of AV-related fatalities on one hand.

It’s true many of these test miles were travelled in controlled environments that don’t necessarily replicate real road conditions. Federal regulations and millions more miles of real road testing are needed before this technology can be fully deployed in the US.

More people may die in AV-related crashes before that happens, and every single death is a tragedy. But, as we contemplate a future with self-driving vehicles, it’s worth remembering an exponentially larger number will still likely die because of human-driven cars.
