An extensive international survey reveals distinct global preferences concerning the ethics built into the programming for autonomous vehicles.
The Massachusetts Institute of Technology survey involved more than 2 million online participants from more than 200 countries weighing in on versions of a classic ethical conundrum called the Trolley Problem.
This involves scenarios in which a vehicle accident is imminent and the vehicle must choose among two or more potentially fatal courses of action.
In the case of driverless cars, this might mean swerving toward a couple of people rather than a large group of bystanders.
Edmond Awad, a postdoctoral researcher at the MIT Media Lab, says the study is trying to understand the kinds of moral decisions driverless cars might have to make. “We don’t know yet how they should do that,” he says in a statement.
Awad says the survey identified three preferences that people endorsed most strongly. The most emphatic global preferences were for sparing the lives of humans over the lives of other animals; sparing the lives of many people rather than a few; and sparing the lives of the young rather than the old.
“The main preferences were to some degree universally agreed upon,” Awad says.
However, the degree of agreement varies among different groups or countries.
The researchers found a less pronounced tendency to favor younger people, rather than the elderly, in what they defined as an “eastern” cluster of countries, including many in Asia.
The study involved researchers from MIT, Harvard, Canada’s University of British Columbia and France’s Toulouse School of Economics. They designed what they call Moral Machine, a multilingual online game in which participants could state their preferences concerning a series of dilemmas autonomous vehicles might face.
For instance, they were asked whether autonomous vehicles should spare the lives of law-abiding bystanders or, alternatively, lawbreaking pedestrians who might be jaywalking. Most people in the survey opted for the former.
In total, Moral Machine compiled nearly 40 million individual decisions. The researchers analyzed the data as a whole while also breaking participants into subgroups defined by age, education, gender, income, and political and religious views. Of the participants, 491,921 offered demographic data.
The researchers didn’t find marked differences in moral preferences based on these demographic characteristics, but they did find larger clusters of moral preferences based on cultural and geographic affiliations.
They defined “western,” “eastern” and “southern” clusters of countries, and found some more pronounced variations along these lines.
Respondents in southern countries had a relatively stronger tendency to favor sparing young people rather than the elderly, especially compared with the eastern cluster.
Awad suggests that acknowledging these preferences should be a basic part of informing public discussion of these issues.
In all regions, for example, respondents showed a moderate preference for sparing law-abiding bystanders over jaywalkers; knowing such preferences could, in theory, inform the way software is written to control autonomous vehicles.
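As a purely illustrative sketch of that idea, surveyed preferences could be encoded as weights in a scoring rule that ranks possible crash outcomes. The weights, names, and scenario below are invented for illustration; they are not figures from the study or anyone's actual vehicle software.

```python
from dataclasses import dataclass

# Hypothetical preference weights (higher = stronger preference to spare).
# These values are placeholders, not numbers from the Moral Machine study.
PREFERENCE_WEIGHTS = {
    "human": 1.0,    # spare humans over animals
    "animal": 0.2,
    "young": 0.3,    # bonus for sparing younger people
    "lawful": 0.2,   # bonus for sparing law-abiding bystanders
}

@dataclass
class Outcome:
    """One possible crash outcome: who would be spared if it is chosen."""
    spared_humans: int = 0
    spared_animals: int = 0
    spared_young: int = 0
    spared_lawful: int = 0

def outcome_score(o: Outcome) -> float:
    """Score an outcome by the survey-style preferences: humans over
    animals, many over few, with bonuses for the young and law-abiding."""
    return (o.spared_humans * PREFERENCE_WEIGHTS["human"]
            + o.spared_animals * PREFERENCE_WEIGHTS["animal"]
            + o.spared_young * PREFERENCE_WEIGHTS["young"]
            + o.spared_lawful * PREFERENCE_WEIGHTS["lawful"])

def choose(outcomes: list[Outcome]) -> Outcome:
    """Pick the highest-scoring outcome under the weighted preferences."""
    return max(outcomes, key=outcome_score)

# Invented scenario: swerving spares five law-abiding bystanders,
# staying on course spares two jaywalkers. The many-over-few and
# lawfulness preferences both favor swerving here.
swerve = Outcome(spared_humans=5, spared_lawful=5)
stay = Outcome(spared_humans=2)
best = choose([swerve, stay])
```

A real system would face far harder questions, including whose weights to use when regions disagree, which is exactly the tension the clusters in the study surface.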
Beyond the results of the survey, Awad suggests, seeking public input on issues of innovation and public safety should become a larger part of the dialogue surrounding autonomous vehicles.
“What we have tried to do in this project, and what I would hope becomes more common, is to create public engagement in these sorts of decisions,” he says.