Does Moral Code have a Moral Code? Probing Delphi’s Moral Philosophy

Kathleen C. Fraser, Svetlana Kiritchenko, Esma Balkir


Abstract
In an effort to guarantee that machine learning model outputs conform with human moral values, recent work has begun exploring the possibility of explicitly training models to learn the difference between right and wrong. This is typically done in a bottom-up fashion, by exposing the model to different scenarios, annotated with human moral judgements. One question, however, is whether the trained models actually learn any consistent, higher-level ethical principles from these datasets – and if so, what? Here, we probe the Allen AI Delphi model with a set of standardized morality questionnaires, and find that, despite some inconsistencies, Delphi tends to mirror the moral principles associated with the demographic groups involved in the annotation process. We question whether this is desirable and discuss how we might move forward with this knowledge.
Anthology ID:
2022.trustnlp-1.3
Volume:
Proceedings of the 2nd Workshop on Trustworthy Natural Language Processing (TrustNLP 2022)
Month:
July
Year:
2022
Address:
Seattle, U.S.A.
Editors:
Apurv Verma, Yada Pruksachatkun, Kai-Wei Chang, Aram Galstyan, Jwala Dhamala, Yang Trista Cao
Venue:
TrustNLP
Publisher:
Association for Computational Linguistics
Pages:
26–42
URL:
https://aclanthology.org/2022.trustnlp-1.3
DOI:
10.18653/v1/2022.trustnlp-1.3
Bibkey:
Cite (ACL):
Kathleen C. Fraser, Svetlana Kiritchenko, and Esma Balkir. 2022. Does Moral Code have a Moral Code? Probing Delphi’s Moral Philosophy. In Proceedings of the 2nd Workshop on Trustworthy Natural Language Processing (TrustNLP 2022), pages 26–42, Seattle, U.S.A. Association for Computational Linguistics.
Cite (Informal):
Does Moral Code have a Moral Code? Probing Delphi’s Moral Philosophy (Fraser et al., TrustNLP 2022)
PDF:
https://aclanthology.org/2022.trustnlp-1.3.pdf
Video:
https://aclanthology.org/2022.trustnlp-1.3.mp4
Data:
ETHICS, Moral Stories