Machines and humans in sacrificial moral dilemmas: Required similarly but judged differently?
Chu, Yueying; Liu, Peng.
Affiliation
  • Chu Y; Center for Psychological Sciences, Zhejiang University, 310063 Hangzhou, Zhejiang, China; Department of Psychology and Behavioral Sciences, Zhejiang University, 310030 Hangzhou, Zhejiang, China.
  • Liu P; Center for Psychological Sciences, Zhejiang University, 310063 Hangzhou, Zhejiang, China. Electronic address: pengliu86@zju.edu.cn.
Cognition; 239: 105575, 2023 Oct.
Article in En | MEDLINE | ID: mdl-37517138
ABSTRACT
Interest is growing in understanding human-machine differences in morality. Prior research relying on Trolley-like, moral-impersonal dilemmas suggests that people may apply similar norms to humans and machines yet judge their identical decisions differently. We examined the moral norms people impose on humans and robots (Study 1) and their moral judgments of these agents' decisions (Study 2) in the Trolley and Footbridge dilemmas. Participants imposed similar, utilitarian norms on both agents in the Trolley dilemma but different norms in the Footbridge dilemma, where fewer participants thought humans, compared with robots, should take action in this moral-personal dilemma. Unlike previous research, we observed a norm-judgment symmetry: prospective norms aligned with retrospective judgments. Decisions that were more strongly required were judged as more moral across agents and dilemmas. We discuss the theoretical implications for machine morality.
Full text: 1 | Collection: 01-internacional | Database: MEDLINE | Main subject: Decision Making / Judgment | Type of study: Observational studies | Limits: Humans | Language: En | Journal: Cognition | Year: 2023 | Document type: Article | Affiliation country: China