Government to investigate why a driverless car on Autopilot got into a fatal accident

Published by: Dan Calabrese on Friday, July 1st, 2016

Here's a thought: Someone should have been driving?

I can't be the only person in the world who is skeptical of the idea of "driverless cars," but I'll admit I've been very surprised by the lack of skepticism I hear from others. Cars are not airplanes, which can go on autopilot for lengthy periods because at 40,000 feet there are no tractor trailers to pull out from behind a blind corner, nor is there someone just ahead of you to slam on their brakes.

But the automotive world, which has been known to have a bad idea or two, is all in with the idea that cars can be put on autopilot - taking the job of steering and braking away from drivers. The driverless car! What could go wrong?

This:

A fatal accident in which the driver of a Tesla Motors Inc (TSLA.O) Model S car operating in Autopilot mode was killed in a collision with a truck has prompted an investigation by federal highway safety regulators, the U.S. government and Tesla disclosed on Thursday.

The investigation of the first known fatality to involve a Model S operating on Autopilot comes as Tesla and other automakers are gearing up to offer systems that allow vehicles to pilot themselves under certain conditions across a wide range of vehicles over the next several years.

The National Highway Traffic Safety Administration said it is investigating 25,000 Model S sedans that are equipped with the Autopilot system.

The accident, which according to a report from the Florida Highway Patrol killed 40-year-old Joshua Brown on a clear, dry roadway on May 7 in Williston, Florida, will add fuel to a debate within the auto industry and in legal circles over the safety of systems that take partial control of steering and braking from drivers.

The NHTSA said preliminary reports indicate the crash occurred when a tractor-trailer made a left turn in front of the Tesla at an intersection.

Luxury electric car maker Tesla said in a blog post on Thursday that "neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

If your autopilot system can't distinguish the white side of a tractor trailer from a brightly lit sky, I have a question, and it's not, "How do you fix that problem?" The question is why anyone would think it's a good idea to trust a computer program to drive your car for you. The solution to the problem is that the human being who knows the difference between a truck and the sky drives the car.

I file this under the category of just-because-you-can-build-it-doesn't-mean-you-should. Why does anyone need a car that drives itself? I don't want to sound like one of those people from 50 years ago wondering why anyone would need their own personal computer. Obviously a given generation can only guess at the perspective of a future generation and the way its technology might be used. But when a computer malfunctions, it doesn't kill you. When a driverless car malfunctions, it very well might, and in this case it did kill this guy.

Today personal computers and mobile devices are ubiquitous, and yet we still get blue screens of death and home-screen freeze-ups. The technology is impressive and highly useful, but it's not foolproof. So why would anyone think a technology could be developed that is so foolproof you could let it drive your car for you, and literally trust it with your life?

Even if it's theoretically possible, why does it need to happen? What's so hard about driving your own car? If you think you need a driverless car because you can't stop texting and checking Facebook when you're behind the wheel, I'm going to suggest that technology is your problem, not your solution.
