It is a Bird Therefore it is a Robin: On BERT’s Internal Consistency Between Hypernym Knowledge and Logical Words

Nicolas Guerin, Emmanuel Chemla


Abstract
The lexical knowledge of NLP systems should be tested (i) for its internal consistency (avoiding groundedness issues) and (ii) both for content words and logical words. In this paper we propose a new method to test the understanding of the hypernymy relationship by measuring its antisymmetry according to the models. Previous studies often rely only on the direct question (e.g., A robin is a ...), where we argue a correct answer could rely on collocational cues alone, rather than hierarchical cues. We show how to control for this, and how important it is. We develop a method to ask similar questions about logical words that encode an entailment-like relation (e.g., because or therefore). Our results show important weaknesses of BERT-like models on these semantic tasks.
Anthology ID:
2023.findings-acl.560
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8807–8817
URL:
https://aclanthology.org/2023.findings-acl.560
DOI:
10.18653/v1/2023.findings-acl.560
Cite (ACL):
Nicolas Guerin and Emmanuel Chemla. 2023. It is a Bird Therefore it is a Robin: On BERT’s Internal Consistency Between Hypernym Knowledge and Logical Words. In Findings of the Association for Computational Linguistics: ACL 2023, pages 8807–8817, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
It is a Bird Therefore it is a Robin: On BERT’s Internal Consistency Between Hypernym Knowledge and Logical Words (Guerin & Chemla, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.560.pdf