Machines and humans in sacrificial moral dilemmas: Required similarly but judged differently?
Cognition; 239: 105575, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37517138
ABSTRACT
There is increasing interest in understanding human-machine differences in morality. Prior research relying on Trolley-like, moral-impersonal dilemmas suggests that people may apply similar norms to humans and machines yet judge their identical decisions differently. We examined the moral norms people impose on humans and robots (Study 1) and their moral judgments of these agents' decisions (Study 2) in the Trolley and Footbridge dilemmas. Participants imposed similar, utilitarian norms on both agents in the Trolley dilemma but different norms in the Footbridge dilemma, where fewer participants thought humans, compared with robots, should take action in this moral-personal dilemma. Unlike previous research, we observed a norm-judgment symmetry: prospective norms aligned with retrospective judgments, and the more strongly a decision was required, the more moral it was judged, across agents and dilemmas. We discuss the theoretical implications for machine morality.
Full text: 1
Collection: 01-internacional
Database: MEDLINE
Main subject: Decision Making / Judgment
Type of study: Observational_studies
Limits: Humans
Language: En
Journal: Cognition
Year: 2023
Document type: Article
Affiliation country: China