Last night a woman was struck by an autonomous Uber vehicle in Tempe, Arizona

We all agree that self-driving tech should be able to see through or around solid objects. That’s what theoretically makes it safer than a human driver.

Maybe this difference of opinion is a philosophical discussion rather than a technical one, and some of us favor Harry Truman’s iconic sign.

There were no solid objects. She was in the middle of the road, not darting out between cars. Should have been an easy object for the tech to see.

We all agree the technology failed.

Where I work, I have been assigned a golf cart for getting around campus in my official capacity. Recently, a leaf spring clamp on the left rear wheel broke, and the tire rubs against the fender when I make right turns. Our fleet mechanic says the golf cart is safe to drive in this condition, but I looked underneath, and one of the leaf springs has turned 90 degrees, making it vertical instead of horizontal. (This is one reason I regard our fleet mechanic as a hack; I don’t think highly of his professional standards.)

Although I was given the go-ahead to drive my golf cart, I have elected to take it out of service until it is repaired, because I know that, as the operator of the vehicle, I bear the ultimate responsibility for operating it safely. I tend to hold myself accountable that way. That’s my philosophy on motor vehicles: it’s up to the operator to ensure safe operation. Anything else is passing the buck.

I could continue to drive my golf cart, and rely on my mechanic’s opinion as an excuse to avoid responsibility when something happens, but that isn’t how I roll. Perhaps that is why we have a difference of opinion on where the responsibility ultimately resides on a car that was not yet deemed safe for full autonomous operation, and was supposed to be monitored by a driver.

3 Likes

If I were Uber, and my company’s reputation/value/future depended on successful (as in ‘don’t kill anyone’) testing, I would have some way to monitor ‘safety driver’ attention: periodic alerts, tasks, whatever. Just putting them in the car, regardless of the amount of training, wouldn’t be enough. One bad accident, and see what happens.
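
Here’s roughly the kind of check I have in mind, as a toy Python sketch. Every name and threshold is invented for illustration; it says nothing about what Uber actually runs.

```python
import random

# Toy simulation of a safety-driver attention check: the system chimes
# at random intervals and requires an acknowledgment within a timeout.
# All values here are assumptions for illustration only.

ACK_PROBABILITY = 0.9   # assumed chance an attentive driver responds in time

def driver_acknowledged() -> bool:
    """Stub for a steering-wheel button press before the timeout expires."""
    return random.random() < ACK_PROBABILITY

def run_shift(n_checks: int) -> list:
    """Issue n_checks prompts; return the indices of the missed ones."""
    missed = []
    for i in range(n_checks):
        if not driver_acknowledged():
            missed.append(i)
            # A real system would escalate here: louder alert, slow or end
            # the test run, flag the drive for video review.
    return missed

if __name__ == "__main__":
    misses = run_shift(20)
    print(f"missed {len(misses)} of 20 checks: {misses}")
```

The point isn’t the code; it’s that a missed check leaves a record someone has to act on before the next drive.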

3 Likes

I completely agree. The standard should be (has to be?) “don’t kill anyone” testing.

As for monitoring the driver’s attention, I suspect the internal cameras are part of that, which raises the question of whether the video of each driver is reviewed afterwards to see if they were paying enough attention to be sent out on another drive. If I were representing a side against Uber et al., that would be part of discovery. It is possible that this human monitor/driver has a record of not watching the road closely enough.

I’m very reluctant to put all, or even most, of the blame on the human monitor/driver in the car right now. The videos from various outlets that I’ve seen might be slowed down somewhat (I can’t tell for certain) so that it seems like a human driver had more time to react, such as by swerving or braking.
And even if it turns out the person actually had time to react, it is still possible that “being human” means failing to react, or reacting poorly, under such conditions.
If this means I have sympathy for the person in the car as well as the woman who died, so be it.

OTOH if the technical descriptions of radar/lidar are correct, the Uber vehicle/system has no excuse, especially given that the woman had apparently walked across about 3 lanes of open road before being in front of the Uber vehicle.
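
Some back-of-the-envelope numbers make the point. Every figure below is an assumption (typical lane width, typical walking pace, roughly the reported speed), not data from the investigation:

```python
# Rough arithmetic only; all inputs are assumptions, not investigation data.
LANE_WIDTH_M = 3.7       # typical US lane width
LANES_CROSSED = 3
WALK_SPEED_MPS = 1.4     # typical walking pace
CAR_SPEED_MPS = 17.9     # roughly 40 mph
MU, G = 0.7, 9.81        # assumed tire-road friction, gravity

time_in_view = LANES_CROSSED * LANE_WIDTH_M / WALK_SPEED_MPS
braking_distance = CAR_SPEED_MPS**2 / (2 * MU * G)
braking_time = CAR_SPEED_MPS / (MU * G)

print(f"pedestrian in the open roadway for ~{time_in_view:.1f} s")
print(f"full stop needs ~{braking_distance:.0f} m, ~{braking_time:.1f} s of braking")
```

That’s roughly eight seconds in the open versus about two and a half seconds of braking needed. If the lidar/radar stack really never reacted, the margin it wasted was enormous.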

1 Like

Are the driving monitors entering information on laptops? To be commensurate with the average driver, the human driver and the AI driver must be answering texts and eating F-Fs while weaving their way along.

I don’t know the answer to your question (if it wasn’t rhetorical).
I do think that the standard is not (or should not be) for the AI to be no worse than a human driver.

1 Like

It does seem to me that the human behind the wheel(?) was occupied paying attention to something else, but the rest of the post was my usual sarcastic C-rap.

The thing to consider is not that the driver might have reacted faster than the car; it is that some people in this discussion believe a human driver is superior because he or she can slow down after recognizing the potential for a hazard, while an automated car might not slow down until it perceives an actual hazard.

In other words, some of the users here claim they would have been going slower than the automated car to begin with, having assessed the risk of a blind spot at night. That is what I think this driver should have done. He should have overridden the automated controls after recognizing there was a blind spot in front of him, a potential hazard. That’s what I would have done, and based on the comments above, that is what several others would have done.

When I was in truck driving school, we were told a story. For all I know, it might be a myth, but the lesson stuck with me:

A man and his children were parked on the shoulder of a highway in their car. The man was suicidal and was waiting for a semi to come along so he could pull out in front of it. As a semi approached, its driver saw the parked car but did not react. The man pulled out in front of the semi, and all but one of the children died. The surviving child testified, in court or in a deposition, that the father had been attempting to kill them all by deliberately pulling in front of the tractor-trailer. Nonetheless, the court held the professional truck driver liable for the collision, because he saw the car, recognized the potential hazard, and had the ability to change lanes before he got near the car on the shoulder, but he didn’t move over.

There are now laws that mandate you move over to the next lane if there is an emergency vehicle on the shoulder, but truck drivers are instructed to move over for any vehicle parked on the shoulder if they are able to do so, and they should slow down if they’re not able to get over, just like we are supposed to do for emergency vehicles. Even though I am no longer a truck driver, I still do this when I can safely get over, and it’s saved my butt more than a couple times when some idiot pulled onto the highway without accelerating on the shoulder first.
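
The habit is simple enough to write down as a decision rule. This is just my paraphrase of what we were taught, sketched in Python; real move-over laws vary by state:

```python
# Decision rule from truck-driving school, paraphrased. The wording is
# invented for illustration; actual move-over laws differ by state.

def shoulder_hazard_response(adjacent_lane_clear: bool, speed_mph: float) -> str:
    """Any vehicle spotted on the shoulder ahead, emergency or not."""
    if adjacent_lane_clear:
        return "change lanes away from the shoulder"
    # Can't get over safely: slow down and be ready to brake instead.
    return f"stay in lane, slow from {speed_mph:.0f} mph, cover the brake"

print(shoulder_hazard_response(adjacent_lane_clear=True, speed_mph=65))
print(shoulder_hazard_response(adjacent_lane_clear=False, speed_mph=65))
```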

All drivers should scan ahead for potential hazards, but that goes double for those who drive professionally, even if they’re monitoring an automated vehicle. That’s their job, and that’s why I think this driver is largely responsible, not because he should have been able to react faster than the car, but because he should have been actively assessing potential hazards and reacting as if he was driving the car himself.

I’m pretty sure that, legally speaking, even if the driver was at fault, Uber is liable, since he was acting as their employee.

2 Likes

Most definitely. I’m no legal scholar, and there is plenty of blame to go around.

I understand what you are saying. Yet the human monitor/driver might have had instructions, or (mis)understood instructions, to the effect that each precautionary override of the autonomous system (slowing down, changing lanes, etc.) defeats the purpose of the test, AND pits her/his personal judgment against that of the system.

I’m just saying a human would have made a split-second decision to crank the wheel hard left while applying the brakes, just like trying to avoid a deer, or at least avoid a full-on impact. It’s just an automatic reaction. But expecting a monitor sitting there to quickly grab the wheel and take control in time is just nonsense. No way that will happen. You have to be in full control of the vehicle first. Turning the wheel can often reduce or avoid accidents much faster than braking. But in this case you have at least two failures at the same time: the failure of the car to detect the person, and the failure of the monitor to override the computer. Or, on the other hand, a failure of the administration in expecting the monitor to be able to avoid this type of accident. Back to the drawing board, or keyboard as it were.

Would it defeat the purpose of the test, or would the programmers learn something from the driver overriding the programming in hazardous situations? I dare not speculate when someone’s life is at stake.

Picturing that maneuver, I hope you have ABS. Otherwise, locking the wheels would lengthen your stopping distance, and you might skid into the pedestrian with the side of the car rather than the front.
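
The physics behind that worry, with assumed numbers: a tire has one grip budget, and braking and steering draw from the same pool (the “friction circle”). Locked wheels without ABS spend the whole budget sliding:

```python
import math

# Friction-circle arithmetic. mu is an assumed value for dry asphalt;
# braking and steering share the same total grip, so hard braking
# leaves little or nothing for the swerve.

MU, G = 0.7, 9.81
total_grip = MU * G   # maximum total acceleration, m/s^2

for brake_frac in (0.0, 0.5, 0.9, 1.0):
    a_brake = brake_frac * total_grip
    a_steer = math.sqrt(max(total_grip**2 - a_brake**2, 0.0))
    print(f"braking at {brake_frac:.0%} of grip -> {a_steer:.1f} m/s^2 left to steer")
# At 100% braking (or with the wheels locked) there is zero lateral
# authority: the car plows straight on no matter where the wheel points.
```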

That can always be determined post-incident. The human should take over if there’s any question as to the outcome.

Actually, that brings up a question I haven’t seen asked yet. According to reports, the car was speeding. Why would the computer allow the car to speed? I almost wonder whether the autopilot disconnected and no one was controlling the vehicle.

Yeah, but computers tend to be smarter than people give them credit for. They can forecast potentials. Especially since this potential shouldn’t have been hard to forecast: a transient object moving on an intercept course with the car might just intercept the car.
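
At its simplest, forecasting that potential is just extrapolating two constant-velocity tracks and checking their closest approach. A crude sketch, nothing like a production perception stack, with all numbers invented:

```python
# Crudest possible intercept forecast: extrapolate car and object at
# constant velocity, find the closest approach within a time horizon.
# Real trackers are far more elaborate; all numbers here are invented.

def closest_approach(p_car, v_car, p_obj, v_obj, horizon=10.0):
    """Return (time_s, distance_m) of closest approach within the horizon."""
    dpx, dpy = p_obj[0] - p_car[0], p_obj[1] - p_car[1]
    dvx, dvy = v_obj[0] - v_car[0], v_obj[1] - v_car[1]
    dv2 = dvx * dvx + dvy * dvy
    # Minimize |dp + t*dv| over t in [0, horizon].
    t = 0.0 if dv2 == 0 else max(0.0, min(horizon, -(dpx * dvx + dpy * dvy) / dv2))
    dx, dy = dpx + t * dvx, dpy + t * dvy
    return t, (dx * dx + dy * dy) ** 0.5

# Car heading +x at ~40 mph; pedestrian 50 m ahead, one lane over,
# walking across the road at 1.4 m/s.
t, d = closest_approach((0, 0), (17.9, 0), (50, 3.7), (0, -1.4))
print(f"closest approach {d:.1f} m at t = {t:.1f} s -> brake now, not at impact")
```

Two tracks that pass within a fraction of a meter of each other, roughly three seconds out, is exactly the kind of potential a computer can flag long before impact.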

I’m in the camp of “this was a test and the test supervisor wasn’t supervising,” but I’m also in the “once perfected, AI is going to be worlds safer than human drivers” camp.

1 Like

Please note that I am not one of those who claim a human driver is superior. That third quote is out of context. Here it is in context:

Perhaps the speed limit had been changed but the software was not upgraded to reflect that.

On my commute to school on Tuesday, there was a newly constructed and newly opened exit ramp that dumped me right where I needed to be in the middle of the campus. When I took the ramp, my GPS app made it look like I was going off-road or airborne, like that scene in The Blues Brothers where the Nazis drive off the edge of a partially constructed overpass.

I wasn’t trying to change the meaning of what you said - just wasn’t precise enough with the copy/paste.

Which would be an odd way to program it. The map-recorded speed should be the fallback when a speed limit sign isn’t available. Even non-self-driving cars these days can read speed limit signs and display the limit to the driver.
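
Something like this fallback order is what I’d expect, though I’m only guessing at how any real stack does it; all names and thresholds here are invented:

```python
from typing import Optional

# Guesswork, not any vendor's actual logic: prefer a recently
# camera-read sign, fall back to map data, then to a conservative
# default, and never command a speed above the resolved limit.

DEFAULT_LIMIT_MPH = 25.0   # assumed conservative fallback
SIGN_FRESHNESS_S = 60.0    # assumed window in which a sign reading is trusted

def resolve_speed_limit(sign_mph: Optional[float], sign_age_s: float,
                        map_mph: Optional[float]) -> float:
    if sign_mph is not None and sign_age_s < SIGN_FRESHNESS_S:
        return sign_mph
    if map_mph is not None:
        return map_mph
    return DEFAULT_LIMIT_MPH

def commanded_speed(desired_mph: float, **limit_inputs) -> float:
    return min(desired_mph, resolve_speed_limit(**limit_inputs))

# Sign says 35 but the stale map still says 45: the sign should win.
print(commanded_speed(45.0, sign_mph=35.0, sign_age_s=10.0, map_mph=45.0))
```

Under that ordering, a changed limit with an un-upgraded map only causes speeding if the car also failed to read the new sign.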

1 Like