I read an article in this morning’s paper concerning autonomous trucking. A Pittsburgh company is ready to deploy autonomous trucks on Texas highways as an operational test. In their opinion there are fewer things that might go wrong on the highway than in town with autonomous taxis. We’ll see how it works out. Here’s a link to the AP story.
Highway point-to-point is much easier than driving within a city… however. Hopefully the company has studied semi accidents for the last 20 years or so. I am thinking of the autonomous taxi that initially stopped and then ran over a pedestrian who had been knocked in front of it by another car.
What does the truck do when: a driver drives under the trailer? Rear-ends it? Cuts in close in front of the semi and then jams on the brakes?
We have all seen severe, stupid accidents caused by car drivers around semis. How the autonomous semi reacts when this happens is important.
All the more reason to stop calling these things “autonomous.” They should be called “human-programmed.” Human decision-making remains the basis for the programs.
To me, autonomous means that there is no human in the loop during operation. To start, the programming is done by humans, but over time it is probably AI-generated, and it appears that humans will keep score on performance. I guess we’ll see how it works out.
To me, since the humans did the programming, it means they ARE the loop.
You just mean that there is no situated human decision-maker moment by moment during operation. But the thing is still doing nothing but human decision-making.
I know that you didn’t invent the term. Its use is just the norm these days. And I think it’s a problem.
Of course, the machine learning part of the AI does start to make things murky. Thus…I fear the Terminator.
Yeah, AI is a lot more than programming in human behaviors.
Might work.