The Waymo Tesla Self Driving Con

This article was first published by me on Talkmarkets: https://talkmarkets.com/content/technology/the-waymo-tesla-self-driving-con?post=176157&uid=4798

The Waymo Self Driving Con is gearing up. This is my strong opinion, but it appears to be backed up by fact. For example, Waymo CEO John Krafcik was not telling the truth when he said at Web Summit in Lisbon:
Fully self-driving cars are here.
Even Tesla has on its website as of the time of this writing:

All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.
But the truth is, these Teslas do not have full self-driving capability. Recently one crashed into a fully stopped large vehicle at 60 miles per hour after failing to brake. That is not the performance of a fully self-driving vehicle! In my view, Krafcik and Elon Musk are behaving like confidence men, i.e., con men, in claiming their technology is fully self driving when it clearly is not.

In this article, we look into the technology actually required to make a car self driving. I believe the reader will be shocked at how high the bar is for self driving to achieve liftoff. Even multiple technologies and computing genius may not be enough.

Wired, which often gushes about self driving cars, says Waymo is murky about the details of its capability.

Waymo hasn’t disclosed how much territory its cars will cover or what kind of hours they will run, whether it will charge passengers for rides, or the timeline for announcing or figuring out any of that. (Company reps recently declined to give any such details for the existing early rider program in Phoenix.)
It’s also unclear if there’s a system for preventing a freaked-out passenger from clambering into the driver’s seat and grabbing control of the car—or how that would affect questions like insurance, if they then caused a crash. Waymo hasn’t shared clear plans for helping autonomous cars find the people they’re supposed to be picking up—which you know is a problem if you’ve ever been on the phone with an Uber driver, insisting I’m right here on the dot!
Who pays for the insurance and who takes on the liability are also unresolved. And of course, how fast do these cars go? Audi has a car that can rip off distances at 37 miles per hour hands free. Wow. An A8 speeding down the road at 37 miles per hour will have mass appeal. Not!

Waymo likely does not go at a reasonable speed either, but it is hard to find information on the secretive company. So everyone puts Waymo ahead in the self driving wars precisely because it gives up so little information. Talk about misplaced and mispriced capital!

[Image: Creative Commons 2.0 Generic, modified by Mariordo]


Multiple Technologies Required

The truth is, as we learned from a linked Wired article, cars need multiple technologies to be able to self drive, and even then they likely won't work on freeways. Lidar costs $75,000 and by itself does not give the whole picture. It will be impossible to build a self driving car that most people can afford.

And here is the real fly in the ointment: the computer must process multiple sensors at once! These self driving folks have simply bitten off more than they can chew. The cars have trouble with blind spots, ambulances, and spotting balls, cones, and more. And they lack thoughtfulness.
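To see what "processing multiple sensors at once" means in practice, here is a minimal sketch of sensor fusion in Python. All numbers are hypothetical, and the weighted-average approach is a drastic simplification of the Kalman-filter methods real systems use; it only illustrates why combining camera, radar, and lidar readings is itself a computational problem:

```python
# Minimal sensor-fusion sketch: combine distance estimates from several
# sensors, weighting each by its confidence (inverse variance).
# All readings and variances below are hypothetical illustrations.

def fuse(readings):
    """readings: list of (distance_m, variance) tuples -> fused distance."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    return sum(d * w for (d, _), w in zip(readings, weights)) / total

# Camera, radar, and lidar each report distance to a stopped vehicle.
readings = [(42.0, 4.0),   # camera: noisy estimate
            (40.0, 1.0),   # radar: good at range
            (40.5, 0.5)]   # lidar: most precise
print(round(fuse(readings), 2))  # → 40.46
```

Even this toy version must reconcile disagreeing sensors; the real problem adds dozens of sensors, moving objects, and hard real-time deadlines.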

Self driving cars have no ability to judge when it is time to be aggressive and when it is time to be conservative and cautious. There is no judgment in artificial intelligence. Bryan Salesky of Ford-backed Argo AI spilled the beans about self driving technology:

Developing a system that can be manufactured and deployed at scale with cost-effective, maintainable hardware is… challenging.
Self Driving Cars Cannot Think

Worse yet, Salesky wrote in Wired that AI would have to take on thoughtfulness for self driving cars to really work. That is an admission, in my opinion, of total failure. Thoughtfulness, the ability to think, cannot be engineered into these AI products. Once all the technology is assembled, it is still not enough; more is needed. Salesky's statement, while a truthful confession, should only cause self driving technology to be discarded. This autonomous effort is simply a pursuit of the impossible:

Once an autonomous vehicle has the tools to “see” relevant objects around it, it’s up to the car itself to take the next step — identifying the type of object, whether it’s a pedestrian, cyclist, another vehicle or debris on the road, and how fast that object is moving. The car then must make a determination about that object’s likely behavior.
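The steps Salesky describes — detect, classify, estimate speed, predict likely behavior — can be sketched as a pipeline. Everything below is a hypothetical toy, not Argo AI's actual system; the one-line rules stand in for the prediction models he alludes to:

```python
# Toy sketch of the perception steps Salesky describes:
# detect -> classify -> estimate speed -> predict likely behavior.
# The rules below are hypothetical stand-ins for real prediction models.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str         # "pedestrian", "cyclist", "vehicle", or "debris"
    speed_mps: float  # estimated speed in meters per second

def predict_behavior(d: Detection) -> str:
    if d.kind == "pedestrian" and d.speed_mps > 0:
        return "may enter roadway: slow down"
    if d.kind == "debris":
        return "static obstacle: plan lane change"
    if d.kind == "vehicle" and d.speed_mps == 0:
        return "stopped vehicle: brake"  # the very case the Tesla missed
    return "track and maintain distance"

print(predict_behavior(Detection("vehicle", 0.0)))  # → stopped vehicle: brake
```

The point of the sketch is how brittle hand-written rules are: every situation not enumerated falls through to a default, which is exactly the thoughtlessness this article complains about.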

If people get hurt or killed by these self driving cars, the government officials and CEOs should be held responsible for murder. That is a crime, unless, apparently, your technology has been given clearance. It is one thing for a person to sign a waiver absolving the self driving gurus; it is quite another when bike riders, children, and other drivers are harmed. It is time for legislation to stop this risky behavior. Autopilot was on in the Tesla crash, according to the driver. Crashing into a stationary vehicle at 60 miles per hour without braking does not indicate full self-driving capability! Good thing it wasn't a private car with children in the back seat.

So, the CEO of an AI company has warned the deciders that these cars are not capable of thought and are therefore dangerous. The deciders are without excuse, and punishment for harm should go beyond civil penalty. Salesky went on to say that human drivers are very good at what they do. In a punch to the gut of self driving technology, he says:

When we drive a car today, we’re subconsciously estimating the next few seconds of behavior from other road users — anticipating when a pedestrian might jaywalk or when another car may be about to cut us off. Attentive drivers are incredibly good at reacting in these situations — managing their speed and planning out contingencies to adapt to anomalous behavior from others. 
Self driving cars cannot do this, and they are therefore useless, as I have said. Waymo, the supposed leader, appears from my perspective to be as thoughtless as the technology itself. And Tesla has proven to be close to a laughingstock in this arena, albeit with tragedy attached.

The problem, of course, is that when brilliant scientists and engineers are also con artists, it becomes very difficult for average people like you and me to discern the weakness in the con. They fall back on science.

But selling technology that becomes a useless backup, while raising the price of the vehicle, may very well be the end game of this con.
