Tesla's "Full Self-Driving" technology

:roll_eyes:

1 Like

You have to wonder whether there is a legal adviser who can tell Musk that his systems are not ready for prime time, or if he just doesn’t care whether his vehicles kill people.

2 Likes

I don’t know why Musk is pushing this. SAE and IEEE both have their own standards, which the federal government and state governments are following to determine when and if autonomous vehicles will be available. Not sure why Musk doesn’t want to follow those standards.

I’m not going to buy one unless it meets those standards. And I don’t know of any state that will allow these vehicles to be sold unless they do.

1 Like

IMHO it’s possible no vehicle will ever be able to have a safe auto-pilot. For example, my Jeep has lane departure assist (warning and active steering). However, at least in my car, the sensors that operate this function use lane markings, including those at the side of the road, to determine where the “lane” is. Many roads do not have a painted line demarcating the shoulder, so the system in my car cannot function on those roads. In this case, it’s a highway infrastructure issue, not a vehicle software/hardware issue. And given that many of the roads that lack this striping are rural, and often winding (which makes lane departure warning that much more important), it will take years, if ever, to get every single highway and road in the U.S. (forget the rest of the world) properly marked for this to work. Maybe other makes use different technology for this function and can be made to work without the striping.
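For anyone curious, the camera-plus-lane-markings approach is conceptually simple, which is also why it fails on unstriped roads. Here’s a minimal sketch of the idea (not the code from any actual vehicle; the thresholds and sizes are just illustrative guesses):

```python
import cv2
import numpy as np

def find_lane_lines(frame_bgr):
    """Toy lane-marking detector: edge detection plus a Hough transform on the
    lower half of a forward-facing camera frame. Thresholds are illustrative."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)

    # Only consider the road area in front of the car (lower half of the image).
    h = edges.shape[0]
    mask = np.zeros_like(edges)
    mask[h // 2:, :] = 255
    edges = cv2.bitwise_and(edges, mask)

    # Painted markings show up as long line segments converging toward the horizon.
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, 40,
                               minLineLength=60, maxLineGap=30)
    if segments is None:
        # No markings found, which is exactly the failure mode on unstriped roads.
        return []
    return [tuple(seg[0]) for seg in segments]  # each entry is (x1, y1, x2, y2)
```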

2 Likes

But this is not the only way to do it. Carnegie Mellon University developed software, demonstrated in the early ’90s, that could determine the lane without lane lines by using a camera to look for the road edges or the dark trail in the lane center when painted lines were not available. It was shown off with a cross-country drive to an autonomous-car demo in San Diego, CA, in 1995.
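Just to illustrate the “dark trail in the lane center” trick (this is only a sketch of the idea, not CMU’s actual code, and the band and window sizes are made up):

```python
import numpy as np

def steering_offset_from_trail(gray_frame):
    """Illustrative only: locate the darkest vertical band (the worn, oily
    wheel-track trail) in a strip of road ahead and return a lateral offset
    normalized to -1.0 (far left) .. +1.0 (far right)."""
    h, w = gray_frame.shape
    strip = gray_frame[int(0.55 * h):int(0.75 * h), :]    # a band of road ahead
    column_brightness = strip.mean(axis=0)                 # average intensity per column
    smoothed = np.convolve(column_brightness, np.ones(25) / 25.0, mode="same")
    trail_center = int(np.argmin(smoothed))                # darkest column ~ trail center
    return (trail_center - w / 2.0) / (w / 2.0)
```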

2 Likes

No one is even attempting to use full self-driving technology on snow- and ice-covered roads, which we frequently drive on five months of the year in the Northeast.

2 Likes

If Tesla can make their system the standard, then they have a huge advantage over those that don’t have it. It is potentially a very lucrative additional business opportunity. Look what happened when Microsoft talked IBM into using their operating system as a standard. Plenty of computer hardware designs have benefited their originators when they became the standard, too.

Thanks. That would definitely be an improvement over the system my Jeep uses, although, in the end, being an old curmudgeonly type, I’ll probably never accept completely enabled self-driving cars.

2 Likes

That makes two of us

2 Likes

It’s hard for me to imagine that self-driving software would be able to handle all the types of road situations the average driver encounters. Take an adventurous baby who has escaped from their mom and is now crawling across a four-lane road; stuff like that would be hard for software engineers to anticipate. The engineers who developed the software for the 737 MAX passenger jet ran into a few problems themselves, and there’s not much chance of a baby crawling across the sky.

Does that happen a lot where you live? LOL, kidding.

Not often, but I’ve seen it before.

I’d imagine one time would be too many.

1 Like

In this case, Tesla isn’t going to be making the standard. SAE and IEEE are independent entities that don’t have a stake in the game. Federal and state governments aren’t going to let one company, without independent verification, determine when a vehicle is ready for prime time.

IBM sought out Microsoft for an operating system for their new PC. As part of the agreement, Microsoft was able to license its operating system separately from the PC. At the time, IBM (wrongly) thought that the money was going to be in the hardware (NOT the software). The writing was already on the wall: hardware was getting cheaper and software was getting more expensive. It was only a matter of time before they’d cross.

And no matter what happened with IBM and Microsoft, the end product didn’t have the potential to injure or kill people because of a software bug or hardware glitch.

1 Like

There are other instances where there were competing standards early in the development phase. I’m fully aware of SAE and IEEE. I participate in ASTM and IEST, and know about developing standards as a collaboration between government and industry.

This is certainly about standards: what the system MUST do and detect. Standards often don’t specify HOW to do it, only what targets must be met. The best standards, IMHO, don’t stifle innovation by dictating the technology to be used; mandating a specific technology is how we got sealed-beam headlights for decades after better options had been created. Just set a high bar for safety and reliability and let the innovators figure out the best way to meet it.

The challenge becomes detection, and much of this discussion centers on software but also on sensing technologies. 3D scanning lasers are great devices… that can’t differentiate between a rain shower and a brick wall, and snow looks like solid concrete to them. Radar can, but it gives a wide scan view with very limited resolution other than distance. Sonar sensors, like the parking-sensor spots you see on bumpers, are similar to radar with a much shorter range. Cameras offer single-lens 2D views or, better, two-camera 3D views, but they require massive computation at very high speed to resolve the data. GPS receivers give you a plus-or-minus-5-meter location, unless it is cloudy or tall buildings block the signals. Accelerometers combined with steering sensors and ABS wheel sensors can track inch-by-inch changes in direction from one spot to another, but they lose their location fix if a tire slips.

All of them will be required to work together, feeding a robust algorithm loaded with checks and re-checks to make sure it all makes sense, before autonomous vehicles can work in a random world. It is a very difficult task.
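To make the “checks and re-checks” idea concrete, here is a toy sketch (nothing from any real vehicle stack; the sensor names, variances, and thresholds are invented) of cross-checking several range estimates before trusting a fused result:

```python
def fuse_ranges(readings, max_disagreement_m=2.0):
    """Toy sensor-fusion sketch. `readings` is a list of
    (sensor_name, distance_m, variance_m2) tuples from different sensors
    looking at the same object. Returns a fused distance, or None if the
    sensors cannot agree well enough to be trusted."""
    if not readings:
        return None
    distances = sorted(d for _, d, _ in readings)
    consensus = distances[len(distances) // 2]  # median as a rough consensus

    # Plausibility check: drop any sensor that disagrees badly with the
    # consensus (e.g., a lidar blinded by rain or a radar ghost return).
    trusted = [(n, d, v) for n, d, v in readings
               if abs(d - consensus) <= max_disagreement_m]
    if len(trusted) < 2:
        return None  # not enough agreement; slow down or hand control back

    # Inverse-variance weighting: more precise sensors count for more.
    weights = [1.0 / v for _, _, v in trusted]
    return sum(w * d for (_, d, _), w in zip(trusted, weights)) / sum(weights)

# Hypothetical example: lidar, radar, and a stereo camera roughly agree,
# while the short-range sonar reports something implausible and is ignored.
print(fuse_ranges([("lidar", 24.3, 0.05), ("radar", 25.1, 1.0),
                   ("stereo_cam", 23.9, 0.4), ("sonar", 4.0, 0.2)]))
```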

1 Like

Collaboration between gov’t and industry on safety issues has far from a perfect track record. The Turkish Airlines DC-10 that crashed shortly after takeoff from Paris, for example.

You just said the world isn’t perfect. I think we all know that.

3 Likes

Fortunately, perfect is not the goal with vehicle autopilots. 38,680 people died in traffic crashes last year, a rise over previous years despite less traffic due to the pandemic. 2019’s death toll was 33,244. That tells us a couple of interesting things.

First, it tells us that if we can come up with an autopilot system that is universally adopted and which causes fewer than 30,000 deaths per year, we save thousands of people every year.

Second, it tells us that humans are, in the aggregate, idiots who will see less traffic and react by driving even faster and more dangerously than we already do. That system that saves thousands per year will not fall prey to human foibles like showing off, getting impatient, getting distracted, getting drunk, or any of the other routine stupidities that happen daily on our roads. So not only will we save thousands, but the AI death rate will not increase because of unusual world events or even just overdoing it at bars.

AI driving does not need to be perfect. It only needs to be better to be worth adopting.

I think the real trick is going to be that at some point, the autopilots will encounter problems they cannot correct for (dirty sensors, unplowed roads, whatever) and for which the proper action is to turn the car back over to manual control. But if people are used to letting the computer drive, their driving skills will atrophy, so they may not be able to handle it.

In the aviation world, there are examples of this happening. The Asiana Airlines 777 crash at San Francisco Airport was in large part caused by this problem: the airline over-relied on the technology to fly the plane, so when the technology wasn’t available because the instrument landing system at the airport was down for repair, the pilot took over but wasn’t proficient at hand-flying anymore, if he ever was, and ended up crashing short of the runway.

I think we’re going to need to come up with a new driver licensing system which involves periodic currency requirements just like pilots have to comply with. Failure to maintain currency means the car pulls over to the side and stops, and you have to call a cab (or perhaps this will be a business opportunity for professional drivers to parachute in and drive your car for you).

1 Like

That’s exactly what the IEEE and SAE standards do. They list what constitutes each level of driving automation. For a truly autonomous vehicle, the recommended standard is Level 5. Some vehicles are at Level 3… most are still Level 2.
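For reference, here is a rough paraphrase of the SAE J3016 levels being discussed (see the published standard for the exact wording):

```python
# Rough paraphrase of the SAE J3016 driving-automation levels; consult the
# published standard for the exact definitions.
SAE_LEVELS = {
    0: "No automation: the human does all of the driving",
    1: "Driver assistance: steering OR speed control (e.g., adaptive cruise)",
    2: "Partial automation: steering AND speed control, driver must supervise",
    3: "Conditional automation: drives itself in limited conditions, driver must take over on request",
    4: "High automation: no driver takeover needed within its operational design domain",
    5: "Full automation: can drive anywhere a human could, in all conditions",
}
```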

It is a difficult task… and believe it or not, well over 90% of what you said is already in place and working. Snow and ice are the worst problems faced so far, because the sensors would get covered up. So heaters were added to the sensors to keep them clear.

As I’ve stated MULTIPLE times in this forum, this technology is changing constantly. On average, over 1,000,000 lines of code are added or changed every few months.