'Americans are right not to trust self-driving cars'


Whitey, it’s the PROPONENTS of automation who are starting this. (The more clueless among them give 2030 as the “over/under” for when this happens.) I fail to see how you can belittle this fear, when the computer geeks are OPENLY STATING this as their endgame!

(And “slippery slope” is a fallacy now, huh? In legal context, it’s called “precedent.” And precedent is all too real. Must be nice to sit atop an ivory tower and deem every argument you disdain a “fallacy.”)


HUH??? Show me please.


I’m a proponent of automation, and I think manual driving will eventually be either outlawed or relegated to very rare and expensive certification requirements. But that doesn’t mean I’m a proponent of outlawing manual driving. It’s just an inevitable outcome of automation.

As I said more verbosely above, those of us who actively enjoy driving are vastly outnumbered by the people who drive because they have to in order to get somewhere. Those people are going to turn driving over to the computer the minute it’s technically feasible for them to do so.

Eventually those people are going to get irritated at the idea of having to have a license to do the thing that they will never do again, and then they’re going to seek legislation that allows people to be a passenger in an auto-drive car without having to have a licensed driver in the car.

So now the DMV suddenly sees an enormous drop in business in the driver license line because only the small percentage of us who like to drive are bothering to get a license.

At some point interest in getting a license will fall off sufficiently that DMVs will have to close, because it will be considered stupid to waste millions of taxpayer dollars to serve maybe 20 people a month.

At best we will revert to a system more like general aviation. Currently there are vastly fewer flight schools than there are driving schools. The federal government handles pilot licensure, in part because no state wants anything to do with it. It’s expensive and serves a tiny fraction of the population. Sound familiar?

But it is I think more likely that manual driving will simply end up being relegated to private motoring parks (race tracks, etc) in part because manufacturers will push for it because it’s a whole lot cheaper to build a car that doesn’t have any driver controls in it.

In the end, it doesn’t matter whether I’m for automation or not, because it’s going to happen (and incidentally is going to blow up the economy when it comes, because suddenly everyone who has a job driving something is going to get fired). As it happens I’m for automation in principle, but am not looking forward to the initial fallout, including the loss of driving privileges that will follow.

But a point that needs to be stressed is that acknowledging the inevitability of something is not synonymous with being in favor of it.


But that assumes infallibility and sensors that never, ever break. That’s unreasonable. To use the aviation parallel, every plane I ever flew with retractable gear did it via electricity or hydraulics, BUT they all had a “manual override” in the event of primary system failure. I never actually USED it, but there it was, with an inch of dust on it.

Similarly, it would be pointless hubris NOT to include an “in case of emergency, break glass” set of manual controls for SHTF scenarios, even if rarely used.


Google “self driving car 2030” and see for yourself, Mike.


Yes I know the date…but that doesn’t mean it will happen. Nor does it mean that many, many safety factors will be in place to alleviate fears.


No it doesn’t. It assumes a safety factor that is greater than the current safety factor.

If you can get the sensors to 99% reliable, which is a lot more doable than 100% reliable, then you’ve vastly eclipsed the reliability of human drivers. Our “sensors” don’t tend to break while we’re driving, but we ignore them in favor of staring at phones, and then crash into things.

As I’ve said before, self-driving cars do not have to have a 0 mishap rate. They don’t even have to have a 0 death rate. They simply have to be better than humans are, and one commute home during rush hour will tell you that that’s a pretty low bar.

By the way, there are plenty of non-redundant systems on airplanes. Ever fly a single-engine? Well, when that thing quits, the backup is called an off-field landing. :wink:


No it doesn’t. Sensors don’t have to be infallible. Right now today most of the autonomous vehicles have redundant sensors. If one fails, the other takes over. And if there’s a catastrophic error, then the system will shut down and move to a safe spot.
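That failover behavior can be sketched as a tiny state machine: primary sensor healthy means normal operation, primary failed but backup healthy means the redundant unit takes over, and both failed means a controlled safe stop. This is only an illustrative sketch; the mode names and function are assumptions, not any real vehicle’s software.

```python
# Hedged sketch of the failover logic described above.
# All names here are illustrative assumptions, not a real autonomy stack.
from enum import Enum

class Mode(Enum):
    NORMAL = "normal"        # primary sensor healthy
    BACKUP = "backup"        # primary failed, redundant sensor has taken over
    SAFE_STOP = "safe_stop"  # catastrophic fault: pull over and shut down

def next_mode(primary_ok: bool, backup_ok: bool) -> Mode:
    """Pick the operating mode from the health of the two redundant sensors."""
    if primary_ok:
        return Mode.NORMAL
    if backup_ok:
        return Mode.BACKUP
    return Mode.SAFE_STOP
```

The point is simply that “sensor failure” and “crash” are not the same event: a fault routes the vehicle into a degraded but still safe mode.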

The problem autonomous vehicles are having now is HUMANS. The autonomous vehicles obey ALL traffic laws. But humans don’t.

“One Google car, in a test in 2009, couldn’t get through a four-way stop because its sensors kept waiting for other (human) drivers to stop completely and let it go. The human drivers kept inching forward, looking for the advantage — paralyzing Google’s robot.”

There are bugs…there are problems…but not unsolvable. Years to go and the technology is changing extremely fast.


I didn’t deem “slippery slope” arguments fallacious; it’s the lack of logic used in the argument that makes it fallacious.

I think you have your past and future mixed up. Precedent (also known as “stare decisis”) is reliance on an issue that has already been decided. A “slippery slope” argument makes an argument for a circumstance that hasn’t happened, and is unlikely to happen due to the fact that it’s not logically connected with the issue.

If, rather than make a logical “If A happens, B happens - If B happens, C happens - Therefore, if A happens, C will happen” argument, you instead insist that, “If A happens, Z will happen” without linking A to Z, your argument is fallacious, and I take no pleasure in “deeming” it fallacious. I’d rather we steer away from all logical fallacies and have a civilized debate of the issue.


So, at some point cars will be smart enough to drive themselves. It would seem that soon they will be able to diagnose themselves accurately enough that the current master mechanic’s job will go the way of the buggy-whip manufacturer.


I just got back from “driving” to the clinic for my flu shot. They said I need to watch my blood pressure so I’m going to ignore the comments from a couple of you folks, no offense. For those of you asleep for the past 20 or so years, I would just suggest a little reading about the experiences in Eastern Europe after the war as the socialists and commies took over and do a compare/contrast in the good ole US of A. It’ll scare your hair straight.

We developed our own materials management and purchasing system and used to joke about naming it Hal. “Hal, stop the car please.” “Sorry, can’t do that until we reach the pre-programmed destination.” “But Hal, the bridge is out.” “Sorry, I have been given no information on that.” “Hal!!”


Wasn’t Hal a fictional robot in a fictional movie? Are you sure you want to refer to a work of fiction as though it were a documentary?


People who are scared of and don’t understand technology always make those kinds of analogies.

The same types of analogies were made when the automobile was introduced some 100+ years ago.


Yep Hal was fiction.

I’m not scared of technology. I’m scared of the people developing it and profiting from it. Like “leave it to the experts, you people are just too ignorant and backward to appreciate what we are doing for you”.


Mike, you must watch Fox far more than I do to “recognize” quotes from them. My ideology was formed long before Fox News even existed.


Maybe it’ll be like Back to the Future with voice recognition. I don’t know how much of this is true, but the old-timers used to tell stories about when people first went from horses to cars. One was the guy who drove his car through the barn yelling “whoa” the whole time. So they’ll need to add that one to the software. Another was the guy who had two doors put on his barn so he could drive right through and take another run at it if he couldn’t get the car stopped the first time. I sympathize, though. I remember renting a bulldozer once, and as I was going down the hill I discovered dozers have no brakes.


Sensor failure becomes much less of a problem if there are multiple sensors. Redundancy is frequently used where systems can’t be accessed for repairs, or where the consequences of failure are dire. For self-driving vehicles, three sensors could be used and polled many times a second. If one sensor shows a problem but the other two don’t, then the one sensing the problem might have failed. This would not trigger a reaction to the perceived problem, but a note to the owner to get the sensors checked. The systems would cost more, but mass production can bring down the cost dramatically.
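The two-out-of-three scheme described above can be sketched in a few lines: poll all three sensors, take the majority value when two agree within a tolerance, and flag the outlier for service instead of reacting to it. The function name, tolerance, and readings below are illustrative assumptions, not taken from any real vehicle system.

```python
# Hedged sketch of 2-of-3 majority voting over redundant sensor readings.
# TOLERANCE and the sample readings are illustrative assumptions.

TOLERANCE = 0.5  # how closely two readings must agree to count as a majority

def vote(readings):
    """Return (agreed_value, suspect_index) for three redundant readings.

    If one reading disagrees with the other two beyond TOLERANCE, the
    majority value wins and the outlier's index is flagged for service.
    If no two readings agree, return (None, None) as a fault condition.
    """
    a, b, c = readings
    if abs(a - b) <= TOLERANCE:
        # a and b agree; c is suspect only if it disagrees with both
        suspect = 2 if abs(c - a) > TOLERANCE and abs(c - b) > TOLERANCE else None
        return (a + b) / 2, suspect
    if abs(a - c) <= TOLERANCE:
        return (a + c) / 2, 1  # b is the outlier
    if abs(b - c) <= TOLERANCE:
        return (b + c) / 2, 0  # a is the outlier
    return None, None

# Example: the third sensor has drifted badly; the car keeps the majority
# value and merely notes that sensor 2 needs to be checked.
value, suspect = vote([10.1, 10.2, 47.0])
```

Polled many times a second, a single transient glitch on one sensor never reaches the control loop at all.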


Amazing how you all seem to follow the EXACT SAME ideology. It’s like Rush and his ditto-heads.


Believe it or not, some of us idiots know all about redundant systems, statistical odds, fail-safe operations and so on. Still, don’t insult my intelligence and real-life experience of having both generators and computers fail at the same time for whatever reason. Yeah, been there. Maybe only 1%, but I’m experienced enough to know freak incidents happen. Like lightning a mile away traveling through the earth to an isolated computer ground. Doesn’t ever happen, right?

That’s why, just like a pilot, I will always insist a real person be at the controls as back-up no matter what. If you haven’t experienced a freak failure or anticipated one, you either aren’t very experienced or are not being truthful. Just quit trying to oversell the whole idea.


And the failure rate of these systems is far less than:

- You falling asleep at the wheel.
- Getting hit by a drunk or reckless driver.
- You taking your eye off the road the very second that child runs across the street in front of you.

And the list goes on and on and on and on…