# Why Self-Driving Cars Must Be Programmed to Kill



## Stelmaria (Sep 30, 2011)

The development of self-driving cars brings interesting ethical dilemmas. The first is the question of legal liability - so far, manufacturers have taken the step of accepting legal liability. I think this is a great move, as it means they finally have a direct incentive to reduce the road toll (currently over 30,000 people per year in the USA and 1.3 million worldwide). But I would expect many legal challenges before that is all sorted out.

But there is another ethical problem and this is one that actually requires engineers to develop and program a strategy: the Trolley problem.

https://en.wikipedia.org/wiki/Trolley_problem



> Why Self-Driving Cars Must Be Programmed to Kill
> 
> Self-driving cars are already cruising the streets. But before they can become widespread, carmakers must solve an impossible ethical dilemma of algorithmic morality.


Why Self-Driving Cars Must Be Programmed to Kill | MIT Technology Review
Academic article: [1510.03346] Autonomous Vehicles Need Experimental Ethics: Are We Ready for Utilitarian Cars?


----------



## Metalize (Dec 18, 2014)

I remember someone mentioning this a while back. Opting to kill one instead of multiple or whatever the result of that algorithm would be.

Tough choice, but if they exhaust all other options, not sure what else they could do. Now reading the article and the possibility of a car swerving automatically to avoid hitting someone else, I'd be for it.


----------



## Stelmaria (Sep 30, 2011)

Metalize said:


> Tough choice, but if they exhaust all other options, not sure what else they could do.


Well an answer is to design out much of the danger in the first place - lower speeds, different road designs, not drive in risky weather etc.


----------



## Metalize (Dec 18, 2014)

Stelmaria said:


> Well an answer is to design out much of the danger in the first place - lower speeds, different road designs, not drive in risky weather etc.


I edited my answer slightly after reading the article, but I'm in favor of this.

Yes, I agree, but I meant right within the process of that optimization problem -- if such a situation arises and every option the car has to avoid complete loss of life has been exhausted, then I am not seeing a preferable option to simply choosing the least "harmful" option. They mentioned the issues in evaluating how the car would go about that, if we're not going purely by quantity -- but I'm not completely sure how this would be done. Evaluate the survival probability of impact for each person/group of people, by sizing up their physical parameters in real time? It's an interesting algorithmic issue.
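
Out of curiosity, here's a toy sketch of what that "least harmful option" selection could look like, assuming (purely hypothetically) that the car can attach a survival probability to every person affected by each maneuver. The function names, maneuvers, and numbers are all made up for illustration; this is nobody's actual implementation:

```python
# Illustrative only: choose the maneuver that minimizes expected fatalities,
# given per-person survival estimates. A real system would need far richer
# models of uncertainty, physics, and legality than this sketch assumes.

def expected_fatalities(survival_probs):
    """Expected number of deaths, given each affected person's survival probability."""
    return sum(1.0 - p for p in survival_probs)

def least_harmful_maneuver(options):
    """options maps maneuver name -> list of survival probabilities
    (one entry per person affected, occupants included)."""
    return min(options, key=lambda m: expected_fatalities(options[m]))

# Hypothetical scenario: swerving endangers the occupant, braking endangers
# two pedestrians, continuing endangers a group of three.
options = {
    "brake_hard":  [0.6, 0.6],       # two pedestrians
    "swerve_left": [0.3],            # the occupant
    "continue":    [0.5, 0.5, 0.5],  # group of three
}
print(least_harmful_maneuver(options))  # picks the option with fewest expected deaths
```

Even this toy version quietly embeds a contested value judgment (pure head-counting, everyone weighted equally), which is exactly the part the article says is hard.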

On a related note, I could see the government forcing people to sell and purchase exclusively these types of cars, if it's true that they indeed have lower rates of collisions/injuries.


----------



## Stelmaria (Sep 30, 2011)

Metalize said:


> I edited my answer slightly after reading the article, but I'm in favor of this.
> 
> Yes, I agree, but I meant right within the process of that optimization problem -- if such a situation arises and every option the car has to avoid complete loss of life has been exhausted, then I am not seeing a preferable option to simply choosing the least "harmful" option. They mentioned the issues in evaluating how the car would go about that, if we're not going purely by quantity -- but I'm not completely sure how this would be done. Evaluate the survival probability of impact for each person/group of people, by sizing up their physical parameters in real time? It's an interesting algorithmic issue.


Indeed



Metalize said:


> On a related note, I could see the government forcing people to sell and purchase exclusively these types of cars, if it's true that they indeed have lower rates of collisions/injuries.


Yes. But we seem to be a long way from there politically. People seem to accept thousands of road deaths each year as normal, even though it is one of the major causes of death among young people (alongside suicide).


----------



## gestalt (Feb 15, 2011)

If a lemming walks into the road while I am driving, there isn't the slightest chance I would be okay with my own car deciding to kill me instead of the lemming.


----------



## Stelmaria (Sep 30, 2011)

gestalt said:


> If a lemming walks into the road while I am driving, there isn't the slightest chance I would be okay with my own car deciding to kill me instead of the lemming.


I'm not so sure you read the article...


----------



## gestalt (Feb 15, 2011)

"Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random?" 

It should protect the occupant provided the occupant is driving on a road in a safe manner.

If the occupant were driving down a crowded city walkway then go ahead, smash him or her into the wall.

If a person decides to randomly run into the middle of the road without looking in either direction (a growing problem in my city), then the occupant's safety should be prioritised.

If the machine is driving the occupant and not the other way around, then the machine must take responsibility for the safety of the occupant. If the machine gets the occupant into trouble, then the machine should not kill the occupant as a response to its own failure. And if a machine is driving an occupant in a safe and ordinary manner, and a pedestrian makes a decision to do something stupid, the pedestrian needs to take responsibility.

Since machines and programs fail periodically, we cannot program them to take these sorts of steps.

Basically, if a lemming decides to get in the way, it's the lemming's fault.

... this conversation reminded me of a band : D


----------



## IDontThinkSo (Aug 24, 2011)

The trolley dilemma has already been solved. Consequentialists just don't understand the solution.


----------



## Grandmaster Yoda (Jan 18, 2014)

Well the car should kill the owner so no one has to lose money on the 10 people who died.


----------



## leftover crack (May 12, 2013)

Stelmaria said:


> lower speeds


What sort of lower speeds?
lower than the speed limit? Hell no.


----------



## Stelmaria (Sep 30, 2011)

leftover crack said:


> What sort of lower speeds?
> lower than the speed limit? Hell no.


Just lower the speed limits where there is a high risk of collisions.

Survivable Speeds | Greater Wellington Regional Council


----------



## Misaki (Feb 1, 2015)

Was reading about this the other day and found it quite interesting. The initial example is narrowly constrained, and a question crossed my mind: does it matter what the people crossing the street were doing? For instance, what if a group of people, drunk out of their minds, decides to jump onto the street right in your path? I'd be curious to hear people's thoughts, and maybe some other more complex scenarios. Just to be clear: I'm not exactly saying that some people (like these hypothetical drunks) deserve to die, but real-world possibilities that remove the "all things being equal" aspect seem to make these ethical dilemmas trickier.
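
One way to make the "does fault matter?" question concrete: imagine the algorithm weighted each person's harm by some hypothetical culpability factor. Whether such a weight is ethically defensible (or even measurable) is exactly what's up for debate; this made-up sketch just makes the assumption explicit:

```python
# Illustrative only: weight each person's death probability by a hypothetical
# "blamelessness" factor in [0, 1] (1.0 = crossing legally, lower = at fault).
# Nobody has shown such a factor can be estimated in real time; this just
# shows what the math would look like if it could.

def weighted_harm(people):
    """people: list of (death_probability, blamelessness_weight) tuples."""
    return sum(p_death * w for p_death, w in people)

# Hypothetical: a careful pedestrian vs. someone who jumped into traffic.
careful = [(0.5, 1.0)]   # same physical risk...
reckless = [(0.5, 0.4)]  # ...but discounted by fault
print(weighted_harm(careful) > weighted_harm(reckless))  # True: same physics, different weight
```

Note the dilemma doesn't go away, it just moves: now someone has to defend where the weights come from.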


----------



## RobynC (Jun 10, 2011)

Maybe this should be a reason not to develop self-driving cars: That way we don't have to produce robots that kill...


----------



## Stelmaria (Sep 30, 2011)

RobynC said:


> Maybe this should be a reason not to develop self-driving cars: That way we don't have to produce robots that kill...


That's a bit silly, considering the rate at which humans are killing humans right now.


----------



## Inveniet (Aug 21, 2009)

Hmm, well, cars don't run on tracks, so it is kind of a flawed metaphor to begin with.


----------



## Carpentet810 (Nov 17, 2013)

Let's make the cars equal opportunity; that way everyone can be happy, socialist style...


----------



## timeless (Mar 20, 2010)

Mow down the pedestrians. It's basically how I drive anyway.


----------



## RobynC (Jun 10, 2011)

Stelmaria said:


> That's a bit silly, considering the rate at which humans are killing humans right now.


That's not a good logical argument: "Humans kill humans, so we should create more entities that kill."


----------



## VinnieBob (Mar 24, 2014)

RobynC said:


> Maybe this should be a reason not to develop self-driving cars: That way we don't have to produce robots that kill...


this is necessary
think of being an engineer on a train
there is a car stuck on the railroad tracks with a sole occupant who cannot get out
further down the alternate track is a school bus which is also stuck
the engineer has 2 options, since the velocity of the train exceeds its stopping capability in time
if the engineer goes straight, only 1 person dies
if she/he throws a switch to change tracks, then all the occupants on the bus die

the automated car must have such a program
in the long run this will save lives


----------

