
Self-Driving Cars Will Soon Be Programmed to Decide if They Should Kill You

Should cars be hardwired to save the most lives in the event of an accident, even if that means driving off a cliff to avoid a group of school kids?

Google self-driving car. Photo via Flickr user Becky Stern

Before we relinquish complete control of driving to computers, some scientists are working to program self-driving cars with a moral compass, so they can navigate dilemmas like who to save and who to run over in the event of an accident.

According to ABC News, international researchers have been surveying the public on the best moral codes for driverless cars—and things got pretty dystopian pretty quickly.


The survey, published in Science, asked questions like: Should a car be programmed to save the most lives in the event of an accident, even if that means driving itself off a cliff to avoid a group of school kids? And what about pedestrians who are jaywalking—should the car crash to avoid them, even though the pedestrians are breaking the law?

Overall, people were cool with saving the highest number of people in the suggested scenarios, but when saving the most people meant putting the driver in danger, they predictably didn't want to be the one in the car.

"Most people want to live in a world where cars will minimize casualties," MIT professor and the study's co-author Iyad Rahwan told Gizmodo. "But everybody wants their own car to protect them at all costs."

It's a conflict that passengers and automakers are going to have to come to terms with if we're going to live in a world without driver's licenses. But by then, robots will pretty much run things and make up the bulk of the workforce, so we'll have plenty of free time to debate the Trolley Problem.
