Top Ten Reasons Self Driving Cars are Useless

It is time to tell it like it is: self driving cars are practically useless, and dangerously so. I have looked at the top ten reasons they are useless, and the case is strong. I personally believe self driving cars are a scam; they are so useless that you have to consider the scam angle when deciding where to put your money. The government should consider it as well. It is difficult to see driverless cars as anything more than a money grab and a con of immense magnitude.

The immensity of the scam is obvious when we see hundreds upon hundreds of articles supporting self driving cars. Self driving cars are a totalitarian concept, and totalitarian minds will do everything in their power to impose them on the populace. It is how they think, much like the push to make cash obsolete; the totalitarian nature of both is easy to spot. Totalitarianism is not progress.

An example of this is the first reason why self driving cars are a problem:

1. They run red lights, and when they do, it is blamed on the driver for not taking over. But if a driver must take over at critical junctures, that is only safe if the driver is always attentive, and a driver cannot stay attentive when the car is doing the driving. It is human nature not to be attentive. This is the fatal flaw of self driving cars: drivers must intercept the critical mistakes the cars make. If you have to be more alert in a self driving car than in a manually driven one, what on earth is the self driving car good for? Nothing. It is good for nothing.

Here are the levels of self driving cars. Level one and up are quite dangerous. Level three, in my opinion, is asking for the termination of the driver's life, and yet such cars exist, like the Volvo S60 pictured below. Levels 4 and 5 can never be attained, based on the ten reasons given in this article.


Volvo Level 3 S60 by Mariordo (Mario Roberto Duran Ortiz)

SAE automated vehicle classifications, from the Wikipedia article on autonomous cars:
  • Level 0: Automated system has no vehicle control, but may issue warnings.
  • Level 1: Driver must be ready to take control at any time. Automated system may include features such as Adaptive Cruise Control (ACC), Parking Assistance with automated steering, and Lane Keeping Assistance (LKA) Type II in any combination.
  • Level 2: The driver is obliged to detect objects and events and respond if the automated system fails to respond properly. The automated system executes accelerating, braking, and steering. The automated system can deactivate immediately upon takeover by the driver.
  • Level 3: Within known, limited environments (such as freeways), the driver can safely turn their attention away from driving tasks, but must still be prepared to take control when needed.
  • Level 4: As level 3, but no driver attention is required. Outside the limited environment the vehicle must be able to enter a safe fallback mode (i.e., park the car) if the driver does not retake control.
  • Level 5: Other than setting the destination and starting the system, no human intervention is required. The automatic system can drive to any location where it is legal to drive and make its own decisions.

The Volvo S60 has good driving stats, but how serious are its drivers about handing everything over to the car and not paying attention? You can't test for that. And a trip into the city caused this driver, a managing editor at Edmunds, a serious problem:

I was driving our 2015 Volvo S60 and as we approached the security arm at the entrance to our parking garage, the Volvo came to a loud halt.
The Auto Braking by City Safety kicked in even though I wasn't that close to the arm yet. Perhaps I was approaching it too quickly? Certainly, there was no danger of me hitting it. But it sure scared the bejesus out of Caroline and me. I almost bumped my face on the steering wheel. It was like a phantom stomped onto the brake pedal. And I actually had my foot on the brake and was slowing down at the time.
I understand how this could be helpful in a near-crash situation. But its reaction in this incident was overly cautious and not at all graceful.
Donna DeRosa, Managing Editor
The comments on the posted article showed many were unconcerned, saying the technology was new. But this is supposed to be an advanced level 3 car. Who is kidding whom? Would you buy a car that stops when it feels like doing so, at your peril? Apparently US drivers agree: fewer than 1,000 of these cars are on the road in the USA, and they will likely never catch on.

2. Human drivers bump into self driving cars, which often are not even doing the speed limit. From one report:

The glitch? They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well.
As the accidents have piled up -- all minor scrape-ups for now -- the arguments among programmers at places like Google Inc. and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble? 
But that is not the only problem. They don't just obey the speed limit; often they drive below it. Society would grind to a halt if these were the only cars on the road. GDP would crater. The cost of rolling out such inefficient, poorly performing cars would not be worth it. The truth is, their accident rates are twice those of regular cars. Human drivers cannot deal with cars this timid, yet as we will see, the cars cannot be made bolder either.
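The mismatch described above can be sketched in a few lines. This is a toy illustration with made-up numbers, not any real vehicle's control logic: the programmed car caps its speed at the posted limit no matter what, while a human matches the surrounding flow, and the difference between the two is the closing speed that gets the robot rear-ended.

```python
# Toy speed-selection sketch (illustrative only, hypothetical values).
# The robot never exceeds the posted limit; the human matches traffic.

SPEED_LIMIT_MPH = 65

def robot_speed(flow_mph: float) -> float:
    """Drive with the flow, but never above the posted limit."""
    return min(flow_mph, SPEED_LIMIT_MPH)

def human_speed(flow_mph: float) -> float:
    """Simply match the flow of traffic."""
    return flow_mph

flow = 75.0  # highway traffic running well above the limit
print(human_speed(flow) - robot_speed(flow))  # 10.0 mph closing speed
```

On a highway flowing at 75 mph, the law-abiding car is a rolling 10 mph obstacle, which is exactly the "minor scrape-up" scenario the quoted report describes.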

3. Self driving cars have difficulty watching out for motorcycles. From Cycle World we find this information and warning:

FEMA suggests we look at US research by John F. Lenkeit of Dynamic Research, Torrance, CA, which finds that existing forward collision warning systems give “inadequate results” for motorcycles in 41 percent of test cases, versus under 4 percent for cars.
Self-driving vehicles have been promoted to us as having the potential to eliminate most or all of the more than 30,000 annual traffic deaths in the US. If such self-driving vehicles can fail to detect motorcycles, they can presumably fail to detect pedestrians as well. Then this question pops into mind: When a ball bounces into the street, does the autonomous car detect it? Does it then expect the child who may run after it?
As long as self driving inventors think they are playing a sophisticated video game, we cannot even consider self driving cars as being safe. The real world, unfortunately, is too high a hurdle for them to jump over.

4. People will not buy cars programmed to kill them. When the only way to save 10 pedestrians is to crash into a wall and kill yourself, no one wants to buy that kind of car. They just want others to own that kind of car:

“[Participants] were not as confident that autonomous vehicles would be programmed that way in reality—and for a good reason: they actually wished others to cruise in utilitarian autonomous vehicles, more than they wanted to buy utilitarian autonomous vehicles themselves,” conclude Bonnefon and co. And therein lies the paradox. People are in favor of cars that sacrifice the occupant to save other lives—as long as they don't have to drive one themselves.


From Business Insider we get 6 more reasons why self driving cars are fatally flawed:

5. Self driving cars have trouble with bridges. Bridges offer few of the environmental landmarks the cars use to locate themselves, and GPS alone is a big fail.

6. Self driving cars can't see in snow. Yes, Ford is improving the technology by relying on more than just lane markings. But if you build something new, or change the landscape, even the Ford technology could be fooled.

7. Driverless cars can't drive without proper lane markings. That would make almost all of Las Vegas, where I live, off limits. The "modest" infrastructure changes, as they call them, would cost billions of dollars nationwide. I suppose the driverless car companies will want someone else to pay the bill. And what happens when crews lay new pipe and ruin the lane markings, or simply let them wear out?

8. Driverless cars don't do very well in cities. In truth, there are two reasons for this. First, there are too many diversions to keep track of: potholes, cones, pedestrians, and so on. This is why you need human drivers. Second, have you ever used a GPS system and gotten lost? I have. Those systems are far from perfect, and apparently self driving cars have learned the hard way about the imperfection of Google Maps and the like. Why would Google want to develop a car that makes its own maps look so lame? Maybe that is why Google changed its name to Alphabet and the Google car to Waymo. You can't call something headed for failure "Google".

Perhaps Google is more concerned about its brand than about the safety of its slow, limited self driving cars. Nissan researcher Maarten Sierhuis has said that cones and construction mean you will always need drivers; self driving cars will never be able to solve the problem of construction zones. Billions of dollars are likely being invested by companies interested in automated delivery vehicles, yet this statement from the researcher crushes those expectations:

“We will always need the human in the loop.”

9. Self driving cars cannot figure out who has the right of way. Many intersections have no clearly distinct rule for right of way. At some, whoever enters the intersection first goes. Some have a through road and side feeder roads, and some through roads are not easily identifiable as through roads. At some, the person on the right goes. And sometimes it is simply the drivers acknowledging each other and keeping safe. Self driving cars have no judgment; they are only rule based. They are clueless when it comes to intersections.
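The rule-based behavior described above can be made concrete with a toy sketch. This is hypothetical code for illustration, not any manufacturer's logic: each of the precedence rules listed in the paragraph is checked in order, and when none of them discriminates between the two cars, the resolver returns nothing at all — the case human drivers settle with eye contact and judgment.

```python
# Toy right-of-way resolver (illustrative only). It encodes the
# hard rules from the text: first to arrive, through road beats
# feeder road, yield to the right. When no rule fires, it returns
# None -- a rule-based system has no judgment to fall back on.

def resolve_right_of_way(car_a: dict, car_b: dict):
    """Return 'A', 'B', or None if the rules cannot decide."""
    # Rule 1: the car that reached the intersection first goes.
    if car_a["arrival_time"] < car_b["arrival_time"]:
        return "A"
    if car_b["arrival_time"] < car_a["arrival_time"]:
        return "B"
    # Rule 2: a car on the through road beats a feeder road.
    if car_a["on_through_road"] and not car_b["on_through_road"]:
        return "A"
    if car_b["on_through_road"] and not car_a["on_through_road"]:
        return "B"
    # Rule 3: otherwise yield to the driver on the right.
    if car_b["to_the_right"]:
        return "B"
    if car_a["to_the_right"]:
        return "A"
    # Simultaneous arrival, equal roads, ambiguous geometry:
    # the rules are exhausted and the car is stuck.
    return None

a = {"arrival_time": 5.0, "on_through_road": False, "to_the_right": False}
b = {"arrival_time": 5.0, "on_through_road": False, "to_the_right": False}
print(resolve_right_of_way(a, b))  # None
```

The `None` case is the point: a human improvises a resolution, while a purely rule-based car has no next move.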

10. Self driving cars cannot merge safely into traffic. Again, from Business Insider:

Dolan noted that when human drivers try to merge onto roads with cars traveling at higher speeds, they tend to inch forward to make sure it's ok. Often, people will pull out in front of traffic under the assumption that cars will slow down for the merge, he added.
But a driverless car probably wouldn't take that risk because if it projected the velocity of the upcoming car, it would pull back to avoid a crash, he said. 
As you can see, self driving cars are a menace to merging. They just can't do it. They aren't human.
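Dolan's quoted description of the merge problem can be sketched as a toy gap-acceptance check. The numbers and the 20% braking allowance are hypothetical, chosen only to illustrate the asymmetry: the robot projects the oncoming car's velocity and rejects any gap that would force that car to slow down at all, while a human accepts the same gap on the assumption that the other driver will ease off slightly.

```python
# Toy merge gap-acceptance sketch (hypothetical values, not a
# real planner). The robot tolerates zero conflict; the human
# assumes oncoming traffic will shed a little speed for them.

def robot_accepts_gap(gap_m: float, oncoming_mps: float,
                      merge_time_s: float) -> bool:
    """Accept only if the oncoming car never has to slow down."""
    distance_covered = oncoming_mps * merge_time_s
    return distance_covered < gap_m

def human_accepts_gap(gap_m: float, oncoming_mps: float,
                      merge_time_s: float) -> bool:
    """Accept if mild braking (~20% of speed) closes the conflict."""
    distance_covered = oncoming_mps * 0.8 * merge_time_s
    return distance_covered < gap_m

# A 60 m gap, traffic at 30 m/s, and a merge taking 2.2 s:
gap, speed, t = 60.0, 30.0, 2.2
print(robot_accepts_gap(gap, speed, t))  # False: the robot pulls back
print(human_accepts_gap(gap, speed, t))  # True: the human inches out
```

The same gap is a pass for the human and a fail for the robot, which is why the robot sits at the on-ramp while traffic piles up behind it.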

Too Many Hurdles:

Since self driving cars face so many likely insurmountable hurdles, investing in them is an exercise in massive speculation. If Chipotle had trouble getting fast food right, what makes anyone think something as complex as self driving cars will ever catch on? They would almost be funny, sort of Keystone Kops-like, except that they would cause wrecks all over the place.

The self driving car that raced with the others until it could not recognize a wall and crashed should be the poster car for wasting money on a useless invention.

I am certain that if you buy one of these cars and anything goes wrong, all 3 of your friends and their friends will never consider buying one, and neither will you repeat your purchase, if you are still alive. That doesn't sound so good to investors. Self driving cars are a bad idea. The sooner they are banned and shelved, the safer and better off we will all be.

Apparently automakers want you to buy added technology that you dare not use, but will pay dearly for. Smells like a scam to me.


This article was first published by me at Talkmarkets: http://www.talkmarkets.com/content/top-ten-reasons-self-driving-cars-are-useless?post=131571&uid=4798






