Tesla Slams Into Stationary Cop Car: Is It Time to Ban Autonomous Driving?

 | May 30, 2018 | 10:15 AM EDT
Stock quotes in this article: TSLA, GOOGL
On Tuesday morning on Laguna Canyon Road, a Tesla (TSLA) sedan in Autopilot mode slammed into a parked police SUV, causing minor injuries to the Tesla driver and damaging both vehicles, according to a report from the Laguna Beach, California, Police Department. The police car was unoccupied.

This is the latest in a series of accidents involving Tesla's Model S and Model X in recent months. The broader issue goes well beyond Tesla's Autopilot system, and concerns all automakers offering driverless systems.

Back in October 2016, Elon Musk promised that all future Tesla models would have updated software with Level 5 capability. "Basic news is that all cars exiting the factory have the hardware necessary for Level 5 autonomy, so that's in terms of cameras, compute power -- it's in every car we make. On the order of 2,000 cars a week are shipping now with Level 5, literally meaning hardware capable of full self-driving, for driverless capability," he said at the time, when he introduced Autopilot 2.0 for the Tesla Model S and X.

Tesla's Autopilot today is clearly not a driverless system. It's not even a hands-off system.

"When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times," the company said in a statement to Reuters in response to today's crash. "Tesla has always been clear that Autopilot doesn't make the car impervious to all accidents."

This isn't solely about Tesla. Almost all automakers are working on similar driverless systems. Fully autonomous systems are called "Level 5," as that's the point at which the car no longer requires any person behind the wheel. Another term for Level 5 autonomous capability is "robotaxi," which is self-explanatory.

Some automakers are going straight to Level 5, skipping Level 3 and similar "hands-off" systems. Waymo, owned by Alphabet/Google (GOOGL), for example, has said it doesn't want to bother with anything less than Level 5 except for testing purposes.

Almost all other automakers are working on getting to Level 5, and the debate is mostly over whether it will take six months, six years or closer to six decades to get there.

However, few people disagree with the notion that we'll get there eventually, at least on a technical level.

So everyone who has bought a Model S or X since October 2016 will eventually be able to simply step out of the car and have it run their errands for them.

All it will take is a software download, for which Tesla has already charged some of its customers thousands of dollars apiece while they await Tesla's updates.

If several automakers -- including Tesla -- are right about their vision for Level 5 autonomy, sometime soon you will be walking down the street and see an increasing number of vehicles in the lanes with no one behind the steering wheel.

Now imagine a terrorist or a hostile foreign power entering this scenario.

Yes, in a world that is rapidly approaching, every car and truck will become a weapon. Whether you are a hostile power, a terrorist organization or a random lone hacker, there is going to be an increasing risk of an outside party taking control of an existing autonomous car, or building its own.

If you are a foreign power or a financially resourceful terrorist organization, you could make such a vehicle look like any other car on the road. A terrorist organization's main problem today is a shortage of people to make it happen: it needs to get people into the country who will then drive the vehicle.

What if they could instead sit in a remote location on the other side of the planet and simply have the truck or car drive itself to its U.S. target?

In today's case, the Tesla rammed into a stationary cop car. The person behind the wheel did not have malicious intent.

If these sorts of things can happen today without any known malice, just imagine the societal mayhem and chaos that will ensue when driverless vehicles and other autonomous technologies come under the control of hostile foreign powers, terrorist organizations and malicious hackers. I can't think of a much greater national security risk.

Tesla cars hitting firetrucks and cop cars may not yet qualify as a national emergency. It's still early days for Tesla's Autopilot system, but its recurring incidents are enough cause for concern.

Before we see more models from Tesla and other automakers on the road, I believe it is high time for the government to step in and restrict or ban all forms of autonomous driving technology until regulatory provisions are in place that protect pedestrians and other drivers on the road.
