
How Should Autonomous Cars Drive? A Preference for Defaults in Moral Judgments Under Risk and Uncertainty.

Björn Meder, Nadine Fleischhut, Nina-Carolin Krumnau, Michael R. Waldmann.

Abstract

Autonomous vehicles (AVs) promise to make traffic safer, but their societal integration poses ethical challenges. What behavior of AVs is morally acceptable in critical traffic situations when consequences are only probabilistically known (a situation of risk) or even unknown (a situation of uncertainty)? How do people retrospectively evaluate the behavior of an AV in situations in which a road user has been harmed? We addressed these questions in two empirical studies (N = 1,638) that approximated the real-world conditions under which AVs operate by varying the degree of risk and uncertainty of the situation. In Experiment 1, subjects learned that an AV had to decide between staying in the lane or swerving. Each action could lead to a collision with another road user, with some known or unknown likelihood. Subjects' decision preferences and moral judgments varied considerably with specified probabilities under risk, yet less so under uncertainty. The results suggest that staying in the lane and performing an emergency stop is considered a reasonable default, even when this action does not minimize expected loss. Experiment 2 demonstrated that if an AV collided with another road user, subjects' retrospective evaluations of the default action were also more robust against unwanted outcome and hindsight effects than the alternative swerve maneuver. The findings highlight the importance of investigating moral judgments under risk and uncertainty in order to develop policies that are societally acceptable even under critical conditions.
© 2018 Society for Risk Analysis.

Keywords:  Autonomous vehicles; defaults; moral judgment under risk and uncertainty

Year:  2018        PMID: 30157299     DOI: 10.1111/risa.13178

Source DB:  PubMed          Journal:  Risk Anal        ISSN: 0272-4332            Impact factor:   4.000


Related articles:  5 in total

1.  Human decision-making biases in the moral dilemmas of autonomous vehicles.

Authors:  Darius-Aurel Frank; Polymeros Chrysochou; Panagiotis Mitkidis; Dan Ariely
Journal:  Sci Rep       Date:  2019-09-11       Impact factor: 4.379

2.  Solving the Single-Vehicle Self-Driving Car Trolley Problem Using Risk Theory and Vehicle Dynamics.

Authors:  Rebecca Davnall
Journal:  Sci Eng Ethics       Date:  2019-04-01       Impact factor: 3.525

3.  Moral Judgements on the Actions of Self-Driving Cars and Human Drivers in Dilemma Situations From Different Perspectives.

Authors:  Noa Kallioinen; Maria Pershina; Jannik Zeiser; Farbod Nosrat Nezami; Gordon Pipa; Achim Stephan; Peter König
Journal:  Front Psychol       Date:  2019-11-01

4.  Influence of Social Distance Expressed by Driving Support Agent's Utterance on Psychological Acceptability.

Authors:  Tomoki Miyamoto; Daisuke Katagami; Yuka Shigemitsu; Mayumi Usami; Takahiro Tanaka; Hitoshi Kanamori; Yuki Yoshihara; Kazuhiro Fujikake
Journal:  Front Psychol       Date:  2021-02-24

5.  Moral judgment, decision times and emotional salience of a new developed set of sacrificial manual driving dilemmas.

Authors:  Giovanni Bruno; Michela Sarlo; Lorella Lotto; Nicola Cellini; Simone Cutini; Andrea Spoto
Journal:  Curr Psychol       Date:  2022-01-12
