Last night a woman was struck by an autonomous Uber vehicle in Tempe, Arizona

The time elapsed between the incident and the legal settlement was very short. The stakes for Uber were huge, in terms of public opinion and its effect on Uber’s future.

All of that is true, and yet it doesn’t disprove my point that there are other unknown factors that would affect the size of the settlement.

While we don’t know the $$, the stakes for Uber (publicity, etc.) are huge. That’s how I interpreted the comment. They would have paid big $$ to get this out of court. We’ll never know the $$, most likely.

The fact that it was settled quickly tells me the amount was large, a 7-figure or even 8-figure settlement. If it wasn’t large, there’s a high likelihood they would have held out for more. I admit it’s a guess… but an educated guess.

Thank you for saving wear and tear on my fingertips. You give an example of what I have learned over many years and many miles behind the wheel. There are two old pieces of advice from military aviation that also apply to operating a motor vehicle: “Expect the Unexpected” and “Be Ready.” Your experience would also put me in “extreme caution mode,” including an escape route such as the oncoming lane if needed. I guess sensors and computers could accomplish this. Unfortunately there are some situations where physics rules and a human or computer driver cannot avoid the collision. Sadly I know this firsthand.

My guess would be $10 million. A check waved in front of their faces to help relieve the pain, or the alternative of years in litigation. Maybe 10 mil apiece to shut everyone up fast. But that’s pure speculation on my part. Maybe that’s low these days, but most people have a price to go away.

I would surmise others kicked in; for example, companies involved in the creation of the hardware and software the car was using.

The software is written by Uber. And I seriously doubt that the hardware companies contributed even one dime.

Tesla now says that the car that crashed in California was on Autopilot.

Sorry I came so late to this discussion, and I apologize for not reading all the posts, but who could? I just have a few points.

A statistic was quoted that driving deaths occur about once per 100 million miles. That would be for all miles, most of which are driven on interstates. The fatality incidence for non-interstate roads is much worse.

“If airbags save just one life, they are worth it.” No, if airbags only saved one life they would definitely NOT be worth it. Airbags have killed and injured people; we add them to cars, even though they add weight and cost, because they save ENOUGH lives to be worth it.

Any testing process must reach a stage where it gets tested in the real world. Self-driving cars are not and never will be perfect, but they should be acceptable if they are much safer than human drivers. There are legal and ethical problems to be worked out. For example, should a self-driving tractor-trailer run over a pedestrian if the alternative is hitting a school bus full of kids whose driver ran a red light? And the programmer is going to have to make that decision for any number of pedestrians, any number of people on the bus, and any degree of fault on the bus driver’s part.
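
To make that dilemma concrete, here is a purely illustrative sketch of the kind of trade-off a programmer would have to write down somewhere. The function names, weights, and scoring below are my own invented assumptions, not anything any manufacturer actually uses:

```python
# Purely illustrative sketch of the trade-off described above. Every weight
# and threshold is an invented assumption, not anyone's real policy.

def collision_cost(people_at_risk: int, fault_weight: float) -> float:
    """Score an outcome: more people at risk costs more; a lower fault_weight
    discounts an outcome where the other party caused the situation."""
    return people_at_risk * fault_weight

def choose_outcome(pedestrians: int, bus_occupants: int, bus_driver_at_fault: bool) -> str:
    # Should the bus driver having run the red light discount the bus outcome?
    # Someone has to pick this number, and that is exactly the ethical problem.
    bus_fault_weight = 0.8 if bus_driver_at_fault else 1.0

    cost_hit_pedestrians = collision_cost(pedestrians, fault_weight=1.0)
    cost_hit_bus = collision_cost(bus_occupants, fault_weight=bus_fault_weight)

    return "hit the pedestrians" if cost_hit_pedestrians < cost_hit_bus else "hit the bus"

print(choose_outcome(pedestrians=1, bus_occupants=30, bus_driver_at_fault=True))
# -> "hit the pedestrians" (cost 1.0 vs. 30 * 0.8 = 24.0)
```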

Yesterday a Buffalo city police car, driven by an officer who was responding to a call without lights and sirens, struck and killed a pedestrian out for a walk at 6:30 am. Do you think that will generate this many comments anyplace?

Because you brought up programming, it seems possible that autonomous driving (AD) cars will have to include decision-making that chooses between a collision that would likely injure or kill multiple persons and an action that would likely injure or kill only a single occupant of the AD car.
And if the programming means that decision is made by choosing to likely injure or kill the occupant (instead of the multiple persons), do people really want that AD car?

Here’s a link to that news:
http://www.wgrz.com/article/news/local/woman-killed-in-buffalo-police-officer-involved-accident/71-533461746

Buffalo Police say the patrol car involved does not have a dash camera. Investigators will look to see if city surveillance cameras or business surveillance cameras show the accident.

Police did not say if the officer had his lights and sirens activated at the time of the accident.

And in answer to your question: hopefully it will generate more than 171 comments, at least among people in Buffalo.

No, it won’t; it is just another traffic accident. A pedestrian struck and killed in Arizona would not gather national attention if the car had been driven by a regular driver.

It’s just that one is news and the other is not. Unusual accidents tend to get reported and discussed more than ones that happen all the time.

Sensors in cars are not able to determine occupant count. What they can do is gauge size, speed, and trajectory. I expect an algorithm to choose the path of least destruction based on those factors. For example, a larger size is likely to equate to more mass, but it won’t distinguish between a school bus and a box truck. As the situation unfolds, the computer should assess the relative masses and energy of each object and potential conflicts with the path of the vehicle it controls. It would be better to run head-on into a semi going 10 mph than a small car going 50. Size does not necessarily mean something should be avoided under all circumstances. The computer is going to try to avoid a collision, but if one is imminent, it will choose the least impactful option based on data inputs. Humans could make decisions based more on emotion: avoid the bus and kill everyone else, when the bus choice may have been the correct one from a damage-minimizing perspective.
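
For what it’s worth, the “semi at 10 mph vs. small car at 50” comparison can be made concrete with kinetic energy (KE = ½mv²). This is only a rough sketch under my own assumed masses and closing speeds; real sensor fusion and path planning are far more involved:

```python
# Minimal sketch: pick the candidate collision with the least kinetic energy.
# All masses, speeds, and names are illustrative assumptions, not sensor data.

def kinetic_energy_joules(mass_kg: float, speed_mph: float) -> float:
    speed_ms = speed_mph * 0.44704        # mph -> m/s
    return 0.5 * mass_kg * speed_ms ** 2  # KE = 1/2 * m * v^2

# Candidate obstacles as the sensors might estimate them: (label, mass, closing speed)
candidates = [
    ("semi at 10 mph", 15000.0, 10.0),
    ("small car at 50 mph", 1400.0, 50.0),
]

for label, mass, speed in candidates:
    print(f"{label}: ~{kinetic_energy_joules(mass, speed) / 1000:.0f} kJ")
# semi at 10 mph:      ~150 kJ
# small car at 50 mph: ~350 kJ

# Least energy = least destructive choice under this (very crude) model.
least_destructive = min(candidates, key=lambda c: kinetic_energy_joules(c[1], c[2]))
print("choose:", least_destructive[0])
```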

If an AD vehicle can’t “count” the number of fastened seatbelts and/or the (approximate) weight in the seats to gauge the number of occupants, that would seem very unusual, especially for vehicles used by Uber and the like, where the number of occupants might be legally relevant.
And assessing semis, buses, and small cars is certainly important, yet there are simpler situations, like multiple persons in the road versus one occupant in an AD vehicle.

I think twin turbo meant that the sensors in one vehicle cannot discern the number of occupants of another, larger vehicle: a big truck, or a school bus filled with kids. Or an empty school bus vs. a filled one.

Not yet. But the days of cars talking to each other are coming. Your car will be reporting its pax count, and it will be receiving the pax count of everything around it, among other things (such as automatically making a path for an ambulance without having to stop, because every car will know where the ambulance is going). Far in the future, obviously, but system capabilities are going to be pretty amazing when they get here.
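
As a toy illustration of what that kind of broadcast might contain (every field and name here is my own assumption; this is not any real vehicle-to-vehicle message format):

```python
# Toy sketch of the kind of broadcast described above. All fields are
# invented assumptions, not an actual V2V standard.
from dataclasses import dataclass

@dataclass
class VehicleBroadcast:
    vehicle_id: str
    pax_count: int          # occupants this vehicle reports about itself
    lat: float
    lon: float
    heading_deg: float
    is_emergency: bool      # e.g., an ambulance announcing itself
    planned_route: list     # list of (lat, lon) waypoints it intends to follow

ambulance = VehicleBroadcast("EMS-12", pax_count=3, lat=42.886, lon=-78.878,
                             heading_deg=90.0, is_emergency=True,
                             planned_route=[(42.886, -78.870), (42.887, -78.860)])

# Every nearby car receiving this could clear a path before the siren is audible.
print(ambulance)
```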

Probably so.

At that point, we won’t be discussing decisions involving least impactful choice. Once we have full autonomy and inter-vehicle communications, there won’t likely be any surprises to contend with. Even when we insert unpredictable human pedestrians along the roadway. Decisions will be distributed with full awareness by all autonomous vehicles and the best choice will be executed by all. I hope I’m still alive to see it…

Your understanding of AI and programming in general is obviously limited.

The A in AI means artificial. It’s NOT real thinking. Those decisions are not made by a THINKING computer. They’re actually made by the programmers who write the software. They program in scenarios that tell the vehicle exactly what to do if a given situation ever arises. There are THOUSANDS of them programmed in.
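
A much-simplified picture of what “programmed-in scenarios” could look like; the scenario keys and responses below are made up for illustration, and a real system would be vastly more complicated than a lookup table:

```python
# Much-simplified illustration of pre-programmed scenario -> response rules.
# The scenarios and responses are made up; real systems are far more complex.

SCENARIOS = {
    ("pedestrian_ahead", "clear_oncoming_lane"):  "swerve_to_oncoming_lane",
    ("pedestrian_ahead", "oncoming_traffic"):     "maximum_braking",
    ("stopped_vehicle_ahead", "clear_left_lane"): "change_lane_left",
    # ... thousands more rules like these, each written by a programmer
}

def decide(situation: str, context: str) -> str:
    # Fall back to hard braking if no rule matches the detected situation.
    return SCENARIOS.get((situation, context), "maximum_braking")

print(decide("pedestrian_ahead", "oncoming_traffic"))  # -> maximum_braking
print(decide("deer_in_road", "unknown"))               # -> maximum_braking (no rule)
```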