Last night a woman was struck by an autonomous Uber vehicle in Tempe, Arizona

I have no idea what happened, and obviously this is a tragic situation… but based on past autonomous accidents, there’s a high probability it was her own fault and that she would have been killed even if a good driver who was obeying all the laws had been behind the wheel.

More than 1.2 million people die in car collisions each year worldwide. That’s an average of more than 3,200 per day.

In the U.S., the number was 37,461 in 2016, or 102.6 per day on average.
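A quick back-of-the-envelope check of those per-day figures (a trivial sketch; the annual totals are just the ones quoted above):

```python
# Sanity check of the per-day averages quoted above.
worldwide_annual = 1_200_000   # "more than 1.2 million" deaths per year worldwide
us_annual_2016 = 37_461        # U.S. fatalities in 2016

print(worldwide_annual / 365)  # ~3,288 per day, i.e. "more than 3,200 per day"
print(us_annual_2016 / 365)    # ~102.6 per day
```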

Until autonomous cars come anywhere close to those numbers, a single incident doesn’t compare.

Absent the details in this case, it seems illogical to base conclusions “on past autonomous accidents”.

@MikeInNH and @Whitey, you are both right. But remember that this is not just a technical and logical issue. For most people this is an emotional issue. They don’t want to relinquish their right to drive a car (and kill people themselves). Incidents like this will serve to reinforce their bias against self-driving cars. In an age where facts and science no longer seem to matter, one unfortunate incident involving a self-driving car will serve to make the transition to safer roads that much harder.

I 100% agree with bloody_knuckles.

Also, while I am a big believer in autonomous cars, don’t neglect the possibility that the root cause will turn out to be something very shameful for Uber. Remember “the van was the same color as the sky”?

Not all autonomous cars are the same. Waymo is much more advanced than Uber. That’s not even a contest.

Agreed, and I am one of those people, but there are also those who don’t want the responsibility of controlling the car and would rather push that off onto the machine.

This is a popular meme in some circles. In reality it is generally a disagreement over the interpretation of those facts. Facts may not have bias, but those who draw conclusions from them do.

I specifically said there was a high probability. I never said it WAS the fault of the woman who was killed. But so far the autonomous vehicles have been pretty safe, and a high percentage of the few accidents they’ve been in have been the fault of the other party. This one may actually turn out to be the fault of the autonomous vehicle.

Yes, you included “a high probability” as part of your conclusion “based on past autonomous accidents”. That doesn’t make basing a conclusion on past incidents any more logical when the details of this incident aren’t in hand.

“This may actually turn out to be the fault of the autonomous vehicle.”
–yes, it might turn out that way.

I didn’t start this thread because it is “an emotional issue” for me or because I have “bias against self-driving cars” or because “facts and science no longer seem to matter” to me.

Instead, I’m wondering why the facts/science/engineering of self-driving cars are treated so differently from other areas, such as clinical trials for new pharmaceuticals.

For Uber to be using such a vehicle, I expect the vehicle and system to have been tested enough that such an accidental death would be a great unexpected shock. The lack of surprise may suggest a pre-acceptance of such accidental deaths in the future.

I’d probably run down 2-3 pedestrians a week if I didn’t see them and slow down or stop when they crossed against the light or between crosswalks. True, it’s speculation at this point, but I very much doubt an attentive human driver would have hit and killed that pedestrian.

Sounds like she was not in a crosswalk. So what else is new? Lots of folks don’t use crosswalks, and plenty walk around with earbuds in. We always learned in defensive driving that it takes two mistakes at the same time to cause an “accident” (or “crash”, if we’re being PC). The idea is for one party to avoid their mistake and thereby avoid the accident. So we could say it was the victim’s fault, but an alert driver can sometimes avoid such people, and she would still be alive.

It was night, she wasn’t in the crosswalk, there was a backup driver at the wheel: is s/he prosecutable? I’d think so, but I’m no lawyer. I’m willing to wait for the investigation and the likely trial.

First off, there was a backup driver, so why didn’t he or she hit the brakes? Automatic emergency braking, which stops the car if something or someone is in its path, is already offered in many non-autonomous vehicles, and it seems to work very well.

Seems like there’s a possibility of Uber being held criminally liable.

Given the faster reaction time of the autonomous car, the accident was likely unavoidable for either a human-driven car or the autonomous car. Physics is physics: if it takes 30 feet to stop and the pedestrian is 20 feet away, well… A human takes longer to react than the automated system, so the problem would only be worse with a human at the wheel.
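To put rough numbers on the “physics is physics” point: total stopping distance is reaction distance plus braking distance, and a faster-reacting computer only shrinks the first term. A minimal sketch, where every number (speed, reaction times, deceleration) is assumed purely for illustration and is not taken from the actual incident:

```python
# Rough stopping-distance sketch. Every number here (speed, reaction times,
# deceleration) is an illustrative assumption, not a value from the incident.
def stopping_distance_ft(speed_mph, reaction_time_s, decel_ft_s2=21.0):
    """Reaction distance + braking distance, in feet.
    decel_ft_s2 of ~21 ft/s^2 is a typical hard-braking assumption (~0.65 g)."""
    v = speed_mph * 5280 / 3600           # mph -> ft/s
    reaction_dist = v * reaction_time_s   # distance covered before braking starts
    braking_dist = v ** 2 / (2 * decel_ft_s2)
    return reaction_dist + braking_dist

# A computer reacting in ~0.1 s vs. an alert human at ~1.5 s, both at 40 mph:
print(stopping_distance_ft(40, 0.1))   # ~88 ft
print(stopping_distance_ft(40, 1.5))   # ~170 ft
# Either way, a pedestrian who appears 20-30 ft ahead is inside the stopping distance.
```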

Now, an automated system doesn’t get “hunches” that a pedestrian is about to dash out between parked cars or step out from behind a post, does it? Humans can, based on experience, get those hunches and prevent collisions. I see that as the gray area of automation: autonomous cars need to be right every time, while a human driver can just say “I didn’t see her!” to avoid liability.

Much as we have a similar lack of surprise when a human-piloted car runs someone over. It happens. It’s going to happen, no matter who or what is driving.

Making a 2,000-plus-pound object move around at high speed is inherently risky. There will be mishaps that result in injuries and deaths. We’ve already accepted that when humans are driving, but now machines are expected to have a zero mishap rate? That’s silly.

Now that the detail has come to light that the woman darted right in front of the car, it’s pretty clear that she would have been run over no matter how good the AI driver was. As noted above, AI can significantly shorten emergency reaction time, but it cannot rewrite the laws of physics.

If you dart in front of a fast-moving heavy thing, you are probably going to die. The onus is on you to stay out of the path of fast moving heavy things.

Maybe you missed my point, so here it is again: “For Uber to be using such a vehicle, I expect the vehicle and system to have been tested enough that such an accidental death would be a great unexpected shock.”

And regarding your acceptance: US motor vehicle fatalities were about 1 per 100 million miles driven in 2016. If the Uber vehicle and system had been tested for roughly 100 million miles before this fatality occurred, the system would be no better than human drivers; if the testing amounts to less than about 100 million miles, then it is doing worse than human drivers.
I would expect Uber to deploy something that they know, and expect, to be better than human drivers.
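To put rough numbers on that comparison (a sketch only: the human benchmark is the 1-per-100-million-miles figure above, while the Uber test mileage below is a made-up placeholder, since the actual figure isn’t given in this thread):

```python
# Crude rate comparison. The human benchmark is the figure quoted above;
# uber_test_miles is a hypothetical placeholder, NOT Uber's real mileage.
human_rate = 1 / 100_000_000      # U.S. fatalities per vehicle-mile, 2016

uber_fatalities = 1
uber_test_miles = 3_000_000       # assumed purely for illustration
uber_rate = uber_fatalities / uber_test_miles

print(uber_rate / human_rate)     # ~33x the human benchmark under this assumption
```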

The details, and their interpretation still seem to be coming in. I’m not jumping to conclusions.

I saw it. Did you miss mine? It would be a great unexpected shock if there were plenty of time for a human driver to avoid the accident but the AI driver ran her over anyway. It is not a great unexpected shock for a vehicle to be unable to stop instantaneously when someone darts out right in front of it, even if the driver is not human.

This demonstrates a lack of understanding of how statistics work. “1 per 100 million miles” does not mean “you are guaranteed to be able to drive 100 million miles before anything bad happens.” It means that on average, you get 1 incident every 100 million miles driven. Sometimes you might not get any incidents for 500 million miles. Sometimes you might get 5 in one day.

It is not necessary to actually drive 100 million miles in order to get a 1 per 100 million miles statistic.
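One way to see this point: even if fatal crashes really do arrive at a steady rate of 1 per 100 million miles, the chance of seeing at least one in a much shorter test program is small but not zero. A minimal sketch, treating fatalities as a Poisson process at that rate (the test mileages are arbitrary examples, not anyone’s real numbers):

```python
import math

# If fatal crashes arrive at a fixed rate of 1 per 100 million miles, the count
# over m miles is Poisson-distributed with mean m / 1e8.
RATE_PER_MILE = 1 / 100_000_000

def prob_at_least_one(miles):
    """Probability of seeing one or more fatalities in `miles` of driving."""
    return 1 - math.exp(-RATE_PER_MILE * miles)

for miles in (1_000_000, 10_000_000, 100_000_000, 500_000_000):
    print(f"{miles:>12,} miles: {prob_at_least_one(miles):.1%}")
# ~1.0%, ~9.5%, ~63.2%, ~99.3% respectively: a single early fatality is unlikely
# at that rate, but not impossible, and on its own it neither proves nor
# disproves the underlying rate.
```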

It IS testing. They are in the next phase of testing: an autonomous vehicle with a backup driver. At some point this type of testing needed to be done. The question is whether this phase came too soon. I’m not sure. Two years ago I didn’t think these systems were ready, but this technology is changing very fast. A fully autonomous vehicle with no driver at all is still 10+ years away.