Exploring Contrast Consistency of Open-Domain Question Answering Systems on Minimally Edited Questions

Zhihan Zhang, Wenhao Yu, Zheng Ning, Mingxuan Ju, Meng Jiang


Abstract
Contrast consistency, the ability of a model to make consistently correct predictions in the presence of perturbations, is an essential aspect of NLP. While studied in tasks such as sentiment analysis and reading comprehension, it remains unexplored in open-domain question answering (OpenQA) due to the difficulty of collecting perturbed questions that satisfy factuality requirements. In this work, we collect minimally edited questions as challenging contrast sets to evaluate OpenQA models. Our collection approach combines both human annotation and large language model generation. We find that the widely used dense passage retriever (DPR) performs poorly on our contrast sets, despite fitting the training set well and performing competitively on standard test sets. To address this issue, we introduce a simple and effective query-side contrastive loss with the aid of data augmentation to improve DPR training. Our experiments on the contrast sets demonstrate that DPR’s contrast consistency is improved without sacrificing its accuracy on the standard test sets.
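The abstract mentions a query-side contrastive loss with data augmentation for DPR training but does not give its formulation. The sketch below is one plausible instantiation, not the paper's exact objective: it treats an augmented, minimally edited question (which has a different answer) as a hard negative for the original question under an InfoNCE-style loss, so the query encoder learns to separate the two. The function name, temperature, and tensor shapes are illustrative assumptions.

```python
# Minimal sketch (assumed formulation, not the paper's implementation) of a
# query-side contrastive objective for DPR-style retrieval in PyTorch.
import torch
import torch.nn.functional as F

def query_side_contrastive_loss(q_emb, pos_p_emb, edited_q_emb, temperature=0.05):
    """InfoNCE over each original question: its gold passage embedding is the
    positive, and the embedding of its minimally edited variant (different
    answer) acts as a hard negative."""
    q = F.normalize(q_emb, dim=-1)          # [B, d] original question embeddings
    pos = F.normalize(pos_p_emb, dim=-1)    # [B, d] gold passage embeddings
    neg = F.normalize(edited_q_emb, dim=-1) # [B, d] minimally edited question embeddings
    pos_sim = (q * pos).sum(-1, keepdim=True)   # [B, 1] similarity to positive
    neg_sim = (q * neg).sum(-1, keepdim=True)   # [B, 1] similarity to hard negative
    logits = torch.cat([pos_sim, neg_sim], dim=-1) / temperature  # [B, 2]
    labels = torch.zeros(q.size(0), dtype=torch.long, device=q.device)  # positive is index 0
    return F.cross_entropy(logits, labels)

# Illustrative usage with random embeddings:
B, d = 4, 768
loss = query_side_contrastive_loss(torch.randn(B, d), torch.randn(B, d), torch.randn(B, d))
```

In practice such a term would be added to DPR's standard passage-retrieval loss; the weighting between the two and the source of edited questions (human-written or LLM-generated) are design choices the abstract leaves unspecified.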
Anthology ID:
2023.tacl-1.61
Volume:
Transactions of the Association for Computational Linguistics, Volume 11
Year:
2023
Address:
Cambridge, MA
Venue:
TACL
Publisher:
MIT Press
Pages:
1082–1096
URL:
https://aclanthology.org/2023.tacl-1.61
DOI:
10.1162/tacl_a_00591
Cite (ACL):
Zhihan Zhang, Wenhao Yu, Zheng Ning, Mingxuan Ju, and Meng Jiang. 2023. Exploring Contrast Consistency of Open-Domain Question Answering Systems on Minimally Edited Questions. Transactions of the Association for Computational Linguistics, 11:1082–1096.
Cite (Informal):
Exploring Contrast Consistency of Open-Domain Question Answering Systems on Minimally Edited Questions (Zhang et al., TACL 2023)
PDF:
https://aclanthology.org/2023.tacl-1.61.pdf
Video:
https://aclanthology.org/2023.tacl-1.61.mp4