The Queen of England is not England’s Queen: On the Lack of Factual Coherency in PLMs

Paul Youssef, Jörg Schlötterer, Christin Seifert


Abstract
Factual knowledge encoded in Pre-trained Language Models (PLMs) enriches their representations and justifies their use as knowledge bases. Previous work has focused on probing PLMs for factual knowledge by measuring how often they can correctly predict an _object_ entity given a subject and a relation, and on improving fact retrieval by optimizing the prompts used for querying PLMs. In this work, we consider a complementary aspect, namely the coherency of factual knowledge in PLMs, i.e., how often PLMs can predict the _subject_ entity given their initial prediction of the object entity. This goes beyond evaluating how much PLMs know, focusing instead on the internal state of the knowledge inside them. Our results indicate that PLMs have low coherency under manually written, optimized, and paraphrased prompts, but that including an evidence paragraph leads to substantial improvement. This shows that PLMs fail to model inverse relations and need further enhancements before they can retrieve facts from their parameters coherently and be considered knowledge bases.
Anthology ID:
2024.findings-eacl.155
Volume:
Findings of the Association for Computational Linguistics: EACL 2024
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2342–2354
URL:
https://aclanthology.org/2024.findings-eacl.155
Cite (ACL):
Paul Youssef, Jörg Schlötterer, and Christin Seifert. 2024. The Queen of England is not England’s Queen: On the Lack of Factual Coherency in PLMs. In Findings of the Association for Computational Linguistics: EACL 2024, pages 2342–2354, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
The Queen of England is not England’s Queen: On the Lack of Factual Coherency in PLMs (Youssef et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-eacl.155.pdf