

Driverless Cars Should Spare Young People Over Old in Unavoidable Accidents, Massive Survey Finds

In the Moral Machine Experiment, a survey of more than two million people from 233 countries and territories, people preferred to save humans over animals, young over old, and more people over fewer.
[Image: A driverless car scans the area for pedestrians. Credit: Shutterstock]

If you were behind the wheel of a car about to crash, would you rather kill a bunch of cats or some kids? Would you choose to kill yourself and your passengers by crashing into a concrete median, or would you rather kill several elderly people crossing the street?

What if, instead of elderly, it was a pregnant woman and a child? Or two criminals? Or three homeless people? Or a couple of business executives?


These are precisely the questions that researchers, led by MIT postdoc Edmond Awad, tried to answer in their new paper, “The Moral Machine Experiment,” published Wednesday in Nature. The purpose of the study was to understand how people think autonomous cars should decide who to kill if and when the need arises. Their research is based on the responses of more than two million survey participants across 233 countries and territories.

Awad’s team found that people generally preferred to save humans over animals, young over old, and more people over fewer. There were, however, some cultural differences about who to save first.

People living in Latin American countries preferred saving young people over older ones, for example, while the opposite was true for respondents from Asian countries. Most people, however, would rather spare pedestrians over passengers, as well as lawful people over jaywalkers—except in poorer countries, where drivers are generally more tolerant of jaywalking.

[Graph: respondents’ preferences for which groups a driverless car should spare in an unavoidable accident.]

The respondents’ answers were recorded by the Moral Machine, a gamified survey designed in 2016 by Awad’s team to understand our most basic killer instincts.

The Moral Machine game is similar to the infamous trolley problem (a.k.a. would you kill one person to save five?), but calibrated for the autonomous car. Azim Shariff, a co-author on the study and the Canada Research Chair in moral psychology at the University of British Columbia, said in an interview that most drivers rely on gut reactions in dangerous scenarios like these, but that autonomous cars will have the luxury of deliberation.


“They’ll make decisions that redistribute risks to different people on the road,” Shariff said.

Participants were sucked in by a moral dilemma that is popping up more and more as automation becomes central to our lives: How much power do we want to give to the machines? These real-life manifestations of the trolley problem—however rare they might actually be—home in on a very persistent fear people have of making life-or-death decisions.

“It does seem like 2018 was a tipping point where people turned against emerging technologies,” Shariff said. “[The autonomous car] might be the first consumer product that could be programmed to put the life of the owner at risk deliberately and against their will.”

The moral dilemma is particularly pronounced with cars, Shariff said, because we spend so much of our time in them, and money on them.

Read More: Uber Self-Driving Car Kills Arizona Woman, the First Pedestrian Death By Autonomous Car

Plus, nearly every regular driver has encountered a scenario in which they had to make a risky split-second decision. In 2010, Quebec woman Emma Czornobaj stopped her car on the highway to save a family of ducks, killing two people on a motorcycle in the process.

Shariff thinks the research represents “the largest moral psychology study ever conducted.” The Moral Machine game went viral after VICE and others published stories about it, and it has recorded more than 40 million decisions since it launched in 2016.

“Obama talked about it once. That helped. Then we had a bunch of these YouTubers who posted themselves playing the game,” Shariff told me on the phone. “One video-gaming review couple spent an hour and a half going through the 13 scenarios.”

He and his co-authors hope their research informs regulatory agencies and auto manufacturers about how to broach this ethical problem—and why it’s essential to include the public in the conversation.
