Building Multi-domain Dialog State Trackers from Single-domain Dialogs

Qi Zhu, Zheng Zhang, Xiaoyan Zhu, Minlie Huang


Abstract
Existing multi-domain dialog state tracking (DST) models are developed on multi-domain dialogs, which require significant manual effort to define domain relations and to collect data. This process can be challenging and expensive, particularly when numerous domains are involved. In this paper, we propose a divide-and-conquer (DAC) DST paradigm and a multi-domain dialog synthesis framework, which together make it possible to build multi-domain DST models from single-domain dialogs. The DAC paradigm segments a multi-domain dialog into multiple single-domain dialogs for DST, enabling models to generalize better on dialogs involving unseen domain combinations. The multi-domain dialog synthesis framework merges several potentially related single-domain dialogs into one multi-domain dialog and modifies the dialog to simulate domain relations. The synthesized dialogs can help DST models capture the value transfer between domains. Experiments with three representative DST models on two datasets demonstrate the effectiveness of our proposed DAC paradigm and data synthesis framework.
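To make the divide-and-conquer idea concrete, the sketch below segments a multi-domain dialog into contiguous single-domain sub-dialogs that a single-domain DST model could then process independently. The turn format and domain labels are illustrative assumptions for this sketch, not the paper's actual data representation.

```python
# Hypothetical illustration of the DAC segmentation step: split a
# multi-domain dialog into single-domain sub-dialogs. The (domain,
# utterance) turn format below is an assumption for illustration only.

def segment_by_domain(dialog):
    """Split a list of (domain, utterance) turns into contiguous
    single-domain segments, preserving turn order."""
    segments = []
    for domain, utterance in dialog:
        if segments and segments[-1][0] == domain:
            # Same domain as the previous turn: extend current segment.
            segments[-1][1].append(utterance)
        else:
            # Domain switch: start a new single-domain segment.
            segments.append((domain, [utterance]))
    return segments

dialog = [
    ("hotel", "I need a cheap hotel in the centre."),
    ("hotel", "Book it for two nights, please."),
    ("taxi", "Also get me a taxi to the hotel."),
]
print(segment_by_domain(dialog))
```

Each resulting segment can be tracked as an ordinary single-domain dialog; the paper's synthesis framework addresses the complementary problem of simulating cross-segment value transfer (e.g. the taxi destination referring to the booked hotel).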
Anthology ID:
2023.emnlp-main.946
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
15323–15335
URL:
https://aclanthology.org/2023.emnlp-main.946
DOI:
10.18653/v1/2023.emnlp-main.946
Cite (ACL):
Qi Zhu, Zheng Zhang, Xiaoyan Zhu, and Minlie Huang. 2023. Building Multi-domain Dialog State Trackers from Single-domain Dialogs. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 15323–15335, Singapore. Association for Computational Linguistics.
Cite (Informal):
Building Multi-domain Dialog State Trackers from Single-domain Dialogs (Zhu et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.946.pdf
Video:
https://aclanthology.org/2023.emnlp-main.946.mp4