Last night a woman was struck by an autonomous Uber vehicle in Tempe, Arizona…

…except I would never say that, because that would be a sample of one, making it a bad example and a straw man.

The video illustrates a major problem with these autonomous cars that rely on a “driver” to oversee the computer.

There is no way the “driver” is going to watch the road with the same diligence as a real driver would, especially at night. In fact, he looks half asleep.

5 Likes

The forward-facing video has been released. It looks like the lady was in the lane for a couple of seconds; I have no idea why the car wouldn’t have ‘seen’ her.

3 Likes

I just saw the video. To me, this is a clear failure of the autonomous system. Lidar and radar SHOULD have identified the woman long before the headlights allowed visual identification of a pedestrian walking at night in dark clothing. Uber may have released this to avoid a lawsuit. In my mind it is confirmation that Uber’s self-driving system failed miserably.
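
To put rough numbers on that (and these are assumptions, not figures from this thread): a typical automotive lidar sees on the order of 100 m, low beams light up maybe 50 m of road, and the car was reportedly doing about 40 mph. A quick back-of-envelope sketch:

```python
# Rough numbers only -- none of these values are confirmed in this thread.
MPH_TO_MS = 0.44704

lidar_range_m = 100.0        # assumed: typical automotive lidar range
headlight_range_m = 50.0     # assumed: low-beam visual range at night
speed_ms = 40.0 * MPH_TO_MS  # assumed: ~40 mph, about 17.9 m/s

# Seconds of warning each sensor would give before reaching the pedestrian.
print(f"lidar:  {lidar_range_m / speed_ms:.1f} s")      # ~5.6 s
print(f"vision: {headlight_range_m / speed_ms:.1f} s")  # ~2.8 s
```

Since lidar supplies its own illumination, that ~5.6 seconds of warning holds in the dark. Only the camera (and the human) lose range at night, which is why the dark-clothing detail shouldn’t have mattered to the autonomous system at all.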

8 Likes

That’s a good point. The whole benefit of autonomous cars is that you can sit back, relax, and read or watch TV or a movie on your tablet or phone. Better yet, they can be used by people who are disabled.

The technology needs to evolve to where no driver oversight is needed. I look forward to that.

I was pretty disappointed to see the driver looking down when he was supposed to be monitoring the car’s operation. With that in mind, I consider this a human failure. This car was not intended to function without human oversight, and this driver wasn’t providing that oversight.

Had this car been deemed safe to operate without human oversight, I’d agree the automated driving system is completely to blame. In a sense, though, this was a system failure. The human was intended to be an integrated component, and this driver appeared to be disengaged.

1 Like

Yup, now that the video is out, the car definitely should have seen her. Makes you wonder if Uber’s legal troubles from stealing Waymo’s technology mean they’ve redesigned their system to get away from infringement, and they didn’t do a good enough job.

@Whitey I agree: the human was supposed to be monitoring. That said, we do know from earlier incidents with Tesla’s partial Autopilot that humans will pay attention to other stuff if the car is doing the work for them. I don’t think it’s particularly surprising that the human got bored with watching something else drive and started looking at other stuff.

2 Likes

Well, it’s a test of the system that is intended to be 100% autonomous. This was a car failure, first.

5 Likes

It wasn’t deemed ready for 100% autonomous operation, though, was it? Isn’t that why it was being tested with driver oversight?

This boils down to a difference of opinion. I think the buck stops with the driver. He was there for a reason.

1 Like

They knew long before this accident that the ‘safety drivers’ are prone to inattention after hours behind the wheel. It’s been documented. So car/Uber first, driver second.

3 Likes

Autonomous vehicles have NOT been approved for use without a driver. It’s several years away.

Unfortunately, in this phase of testing there’s a chance accidents can and will happen. After each accident or incident, the data needs to be analyzed and adjustments made to the software or hardware. The point of the testing phases is to iron out any bugs.

The next phase is just going to be on a much larger scale.

Even when autonomous vehicles are working perfectly, there are still going to be problems: some edge-case scenario that no one thought of. It’s an evolving process.

It doesn’t surprise me either. For that reason, I would not want to volunteer to do this kind of oversight myself in daily use. That’s also why I don’t disengage myself from the driving experience with an automatic transmission and other tech.

Going forward, they should make clear that driver oversight means 100% of your attention is on the operation of the car.

Having said all that, I do think cruise control is an exception to this truism. By not having to look down at my speedometer, I can better focus on what’s going on around me.

They probably did make that clear. My office has a lot of rules that we all break. I drink coffee at my computer. That’s forbidden per policy, but considering everyone does it, including the guy who put the rule in the rule book, I’ll only get in trouble for it if I spill and wreck my keyboard.

Same thing here - it’s probably policy that he devote 100% of his attention to the road, but he only got in trouble for it because something happened. Otherwise no one would ever have bothered to look at the footage. And since “it won’t happen to me” is a pretty common belief…

2 Likes

…and that’s why I think the buck stops at the driver. If I choose to ignore the rule, I do so at my own risk, even if everyone else is breaking the rule.

If we’re right, they’re going to make an example of him, and hopefully, fewer people will ignore the rule.

3 Likes

And BTW, because I’m sure at least one person in this thread is just wriggling with glee right now, the new facts don’t change my opinions on testing cars. This is why we test them. To find out where their weaknesses are and correct them before we unleash them unsupervised.

The true failure here, as @Whitey said, is that the guy who was supposed to be supervising the car, wasn’t.

This story should have concluded with “the car didn’t react properly and the driver had to intervene to prevent a tragedy. The fleet is grounded until we correct the problem.”

1 Like

So billion-dollar Uber puts a dangerous technology on the road, and the $20/hr driver takes the blame?

1 Like

Yeah, pretty much. Remember the AEGIS cruiser that shot down the Iranian airliner and killed hundreds of people? The guy driving the boat got the blame, even though “the billion dollar Navy put dangerous technology on the ocean.”

The technology isn’t any more dangerous than a normal car if it’s supervised properly by the guy getting paid more than 40 grand a year to do nothing but sit there and supervise.

3 Likes

Happens all the time. Every car on the road is dangerous technology. The person behind the wheel is in control.

When you turn on cruise control, do you blame the car for hitting the car in front of you because you didn’t hit the brake or turn off cruise control?

3 Likes

That remains to be seen. As I said, this is just a squabble about a difference of opinion at this point. None of us are legal experts as far as I know.

Nope. The woman was in the middle of the road; any ‘self-driving’ technology should easily have seen her. This is far different from just ‘cruise control’.