Aligning Predictive Uncertainty with Clarification Questions in Grounded Dialog

Kata Naszadi, Putra Manggala, Christof Monz


Abstract
Asking for clarification is fundamental to effective collaboration. An interactive artificial agent must know when to ask a human instructor for more information in order to ascertain their goals. Previous work bases the timing of questions on supervised models learned from interactions between humans. Instead of treating this as a supervised classification task, we wish to ground the need for questions in the acting agent's predictive uncertainty. In this work, we investigate whether ambiguous linguistic instructions can be aligned with uncertainty in neural models. We train an agent using the T5 encoder-decoder architecture to solve the Minecraft Collaborative Building Task and identify uncertainty metrics that achieve better distributional separation between clear and ambiguous instructions. We further show that well-calibrated prediction probabilities benefit the detection of ambiguous instructions. Lastly, we provide a novel empirical analysis of the relationship between uncertainty and dialog history length and highlight an important property that poses a difficulty for detection.
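As an illustrative companion to the abstract, the sketch below shows one generic way to score an instruction's predictive uncertainty with a T5 model: the mean entropy of the per-token output distributions during greedy decoding. The checkpoint ("t5-small"), the example instruction, and the helper name mean_token_entropy are assumptions for illustration, not the paper's actual metric, model, or setup.

```python
# Illustrative sketch (not the paper's exact method): score an instruction's
# ambiguity by the mean token-level predictive entropy of a T5 model's output
# distribution under greedy decoding. Higher entropy suggests more uncertainty.
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")   # checkpoint is an assumption
model = T5ForConditionalGeneration.from_pretrained("t5-small")
model.eval()

def mean_token_entropy(instruction: str) -> float:
    """Generate greedily and average the entropy of each decoding step's
    output distribution over the vocabulary."""
    inputs = tokenizer(instruction, return_tensors="pt")
    with torch.no_grad():
        out = model.generate(
            **inputs,
            max_new_tokens=32,
            do_sample=False,
            output_scores=True,
            return_dict_in_generate=True,
        )
    entropies = []
    for step_logits in out.scores:  # one logits tensor per generated token
        probs = torch.softmax(step_logits, dim=-1)
        entropy = -(probs * torch.log(probs + 1e-12)).sum(dim=-1)
        entropies.append(entropy.item())
    return sum(entropies) / max(len(entropies), 1)

# Hypothetical usage: a vaguer instruction would be expected to score higher.
print(mean_token_entropy("place a red block next to the blue one"))
```

In practice such a score could be compared against a threshold tuned on held-out clear versus ambiguous instructions to decide when a clarification question is worth asking.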
Anthology ID: 2023.findings-emnlp.999
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 14988–14998
URL: https://aclanthology.org/2023.findings-emnlp.999
DOI: 10.18653/v1/2023.findings-emnlp.999
Cite (ACL): Kata Naszadi, Putra Manggala, and Christof Monz. 2023. Aligning Predictive Uncertainty with Clarification Questions in Grounded Dialog. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 14988–14998, Singapore. Association for Computational Linguistics.
Cite (Informal): Aligning Predictive Uncertainty with Clarification Questions in Grounded Dialog (Naszadi et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.999.pdf