To Swerve or Not to Swerve? The Ethics of Self-Driving Cars

Most drivers today have a pretty clear grasp of the rules of the road, even if they sometimes mess up. Obey traffic laws. Yield to pedestrians. Try not to hit people, or puppies.

But the moral code we take for granted gets a lot more complicated when computers make the decisions. Self-driving cars are programmed to follow the law, but sometimes common sense, or a driver’s conscience, dictates disobeying it.

A team at MIT invented something called the Moral Machine, an online platform that asks users to help a driverless car choose the lesser of two evils: killing three passengers or three pedestrians, for instance. Go ahead and try it; it’s interesting, although not exactly easy.

Here are a few of the ethical dilemmas autonomous cars will soon have to face.

What do you think they should do?

Should Self-Driving Cars Protect Passengers Over Pedestrians?

A cement truck overturns in front of a driverless car carrying two passengers. The car is forced to make a choice: slam into the truck, killing its passengers, or swerve into another lane, striking another driverless car and killing its three passengers.

This conundrum illustrates the difference between “self-protective” autonomous cars and “utilitarian” autonomous cars. “Self-protective” self-driving cars are programmed to protect their passengers. In contrast, “utilitarian” autonomous cars are programmed to reduce casualties in a crash. When researchers asked people to consider scenarios involving these two approaches, something interesting happened: “A large majority of our respondents agreed that cars that impartially minimized overall casualties were more ethical, and were the type they would like to see on the road. But most people also indicated that they would refuse to purchase such a car, expressing a strong preference for buying the self-protective one. In other words, people refused to buy the car they found to be more ethical.”

Translation: We’re all out for ourselves.
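
To make the difference concrete, here’s a toy sketch in Python of how the two programming philosophies would decide the cement-truck scenario differently. The option list, casualty counts, and function names are all invented for illustration; no real autonomous vehicle is programmed this way.

```python
# Toy illustration only. Not how real autonomous vehicles are programmed.
# Each option: (description, this car's passengers killed, others killed).
options = [
    ("brake and hit the overturned truck", 2, 0),
    ("swerve into the neighboring car", 0, 3),
]

def self_protective_choice(options):
    # Protect this car's own passengers above all else.
    return min(options, key=lambda o: o[1])

def utilitarian_choice(options):
    # Minimize total casualties, no matter whose they are.
    return min(options, key=lambda o: o[1] + o[2])

print(self_protective_choice(options)[0])  # swerve into the neighboring car
print(utilitarian_choice(options)[0])      # brake and hit the overturned truck
```

Same crash, opposite answers. Survey respondents essentially said they admire the second function but would only buy a car running the first.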

Should Self-Driving Cars Assign Value to Human Lives?

A school bus filled with children stalls in the road, directly in front of a driverless car. The car could swerve, but then it would hit a bus filled with prison inmates in the other lane. Should it swerve?

In a hypothetical situation where a driverless car must kill one person to save another, is it ethical to value a child’s life more highly than that of an elderly person? Or a doctor’s life more than a felon’s? It may sound icky, but already we place varying values on human life in situations like organ transplant waiting lists and wrongful-death lawsuits. Your life is worth right around $9 million, according to some estimates.

Here’s a possible scenario: What if driverless cars used data on past settlements from car crashes to determine how much space they gave pedestrians? On the surface, it might make sense. If historically there has been a high incidence of costly crashes in a particular neighborhood, why not give pedestrians an extra foot of space? Well, this also could mean that driverless cars would be less careful in poorer neighborhoods, where people have accepted smaller settlements for injuries. “The algorithm would then inadvertently penalize the poor by providing them smaller buffers and slightly increasing their risk of being hit when out for a walk,” research scientist Noah Goodall points out.
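
To see how that bias creeps in, here’s a hypothetical sketch of the kind of settlement-based buffer algorithm Goodall describes. The neighborhoods, settlement figures, and scaling formula are all made up for illustration:

```python
# Hypothetical settlement-based buffer algorithm. All figures are invented.
BASE_BUFFER_FT = 3.0
REFERENCE_SETTLEMENT = 250_000  # dollars; arbitrary normalizing constant

# Imagined average injury settlements by neighborhood.
avg_settlement = {
    "wealthier neighborhood": 500_000,
    "poorer neighborhood": 50_000,
}

def pedestrian_buffer(neighborhood):
    # Give pedestrians more room where past crashes have cost more.
    return BASE_BUFFER_FT * (avg_settlement[neighborhood] / REFERENCE_SETTLEMENT)

for hood in avg_settlement:
    print(f"{hood}: {pedestrian_buffer(hood):.1f} ft of clearance")
# wealthier neighborhood: 6.0 ft of clearance
# poorer neighborhood: 0.6 ft of clearance
```

Notice that nothing in the code mentions income, yet the output systematically gives poorer pedestrians less clearance. That’s exactly the inadvertent penalty Goodall is warning about.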

Should People Keep Driving Themselves When Driverless Cars Can Save So Many Lives?

After thinking about all these scary scenarios, you decide you’re never going to entrust your life to a computer-controlled car. It’s just too risky. Then, as you’re driving on a rainy night, you fail to check your blind spot. You hit another car while changing lanes and seriously injure the driver. Was it unethical to keep driving yourself around?    

Did you know that the National Highway Traffic Safety Administration attributes the critical error in around 94 percent of car crashes to the driver? That’s not the same as saying drivers should always be blamed for accidents, the NHTSA points out. But most crashes trace back to recognition errors (distraction, not paying attention) and decision errors (driving too fast, misjudging other drivers’ actions).

As self-driving car technology improves, it’s almost a sure bet that computers will be safer drivers than humans. One prediction says crashes could be reduced by 90 percent, saving many thousands of lives and billions in medical bills. It may eventually become unethical to keep your hands on the steering wheel, even if you don’t trust autonomous cars.
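
As a rough back-of-envelope check on that claim, assuming NHTSA’s figure of roughly 37,000 U.S. traffic deaths per year and taking the 90 percent prediction at face value:

```python
# Back-of-envelope arithmetic; the inputs are round approximations.
annual_us_road_deaths = 37_000  # roughly NHTSA's recent annual count
predicted_reduction = 0.90      # the 90 percent prediction cited above

lives_saved_per_year = annual_us_road_deaths * predicted_reduction
print(f"About {lives_saved_per_year:,.0f} lives saved per year in the U.S.")
# About 33,300 lives saved per year in the U.S.
```

Even if the prediction turns out to be off by half, that’s still a staggering number.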

It’s a crazy world out there. Luckily, you can count on Compare.com to help you with one thing: saving money on your auto insurance. Take a few minutes to enter your information, and you can get multiple free quotes from major insurers. It’s that easy. Try it today!
