
Self-Driving Cars Kill People

Update 2018-03-29: Settlement and deactivated anti-collision mechanism

In Tempe, USA, a self-driving car killed a woman. The police published video footage of the accident:

Tempe Police Vehicular Crimes Unit is actively investigating
the details of this incident that occurred on March 18th. We will provide updated information regarding the investigation once it is available. pic.twitter.com/2dVP72TziQ

— Tempe Police (@TempePolice) March 21, 2018

This is my opinion on this accident and on self-driving cars killing more people in general.

The Tempe Accident

After watching the video, I have to say that probably no human driver could have avoided this accident. The poor woman crossed the street at night without any light (neither on herself nor from street lighting) and did so without even watching the traffic. This could almost be considered suicide.

However, no human was piloting the car. It was a self-driving car operated by Uber. These cars are equipped with "360 degree cameras, lasers, and radars".

In this case, the car should have recognized the woman as some kind of object clearly on a collision course with the car's path. For lasers and radars, there is no such thing as darkness caused by missing street lighting. They see as clearly as they would in daylight.

Update 2018-03-29: The car that was adapted by Uber was a Volvo XC90. According to recent news, Uber disabled a built-in safety feature that could have prevented the collision or at least initiated an emergency brake. What a bummer.

Considering this, a very basic software or hardware failure caused the death of this woman. Testing a malfunctioning car like that on public ground within a city is unjustifiable. More people will get killed.

Update 2018-03-29: Meanwhile, Uber has reached a settlement with the victim's family. Both parties keep silent about the details. I guess there was enough money for a decent compensation. And Uber was fast enough to settle with the family before more details went public, such as the deliberate disabling of safety features.

Where To Test Self-Driving Cars

European self-driving car researchers use a different approach for testing self-driving cars. This technology is clearly not yet in a state where it can handle typical everyday situations. And I, personally, am very skeptical that self-driving cars will get there within the next five to ten years.

Therefore, self-driving cars are tested only on certain European highways. The situation there is quite homogeneous compared to the situation in cities, where there are too many different players: pedestrians, bicyclists, traffic lights, public transportation, and so forth.

Self-Driving Cars Will Kill More People

As a matter of fact, it is easy to see that unless all cars are self-driving, we are going to see more traffic accidents similar to this one. Humans cannot take over in critical situations as fast as needed, especially when they are bored while the car is driving. Self-driving cars cannot be programmed for every possible situation there is. Therefore, there will be a huge number of situations the computer cannot act on properly.

Uber has probably included a routine to detect bicyclists like the poor woman. They probably have not thought of recognizing a bicycle loaded with lots of bags as a bicycle. This might have caused the car not to recognize the object at all. We'll see. I am curious whether or not we are going to read a detailed error report on this incident.

Computers need to recognize and handle situations with airplanes making an emergency landing on a street. Or people spontaneously changing direction. Or falling advertisements or road signs. Or all kinds of animals on the street. Or huge amounts of water blocking the view of the street. Or wrong-way drivers. Or maliciously hacked street signs. Or outdated map data of all kinds. Or getting into a car chase. Or driving in a city during a power outage. And this list could be continued forever.

Human actors and self-driving cars in combination are much worse than humans alone without self-driving cars. Only when the human factor is eliminated completely from a traffic situation can self-driving cars work properly and therefore reduce traffic accidents by a large degree.

This situation is hard to reach. We would have to ban non-self-driving cars and bar bicyclists and pedestrians from areas where self-driving cars are operated. A trade-off could be that self-driving cars are only allowed on highways but not within inhabited areas.

If we reach a point where only self-driving cars are operated, we could remove all those ugly traffic signs, traffic lights, and even road markings. Most parking lots could be used by humans again instead of being occupied by waiting cars. At least a few positive aspects of this topic.

Comments

Mario wrote on Twitter:

Small disagreement: a driver would have avoided this collision by either turning on hi-beams *or* going slower.
Otherwise the driver would be responsible for the collision in that situation.
The pedestrian didn’t teleport onto the street, rather walked with a rather low pace.

— Mario Landgraf (@sirlanda) March 23, 2018

A fair point regarding my comment that a human driver could not have prevented the accident. I don't know why the car did not use its high beams.
