Breaking Boundaries in Retrieval Systems: Unsupervised Domain Adaptation with Denoise-Finetuning

Che Chen, Ching Yang, Chun-Yi Lin, Hung-Yu Kao


Abstract
Dense retrieval models have exhibited remarkable effectiveness, but they rely on abundant labeled data and face challenges when applied to different domains. Previous domain adaptation methods have used generative models to produce pseudo queries, creating pseudo datasets to enhance the performance of dense retrieval models. However, these approaches typically use unadapted rerank models, leading to potentially imprecise labels. In this paper, we demonstrate the importance of adapting the rerank model to the target domain before using it for label generation. This adaptation yields more accurate labels, thereby improving the overall performance of the dense retrieval model. Additionally, by combining the adapted retrieval model with the adapted rerank model, we achieve significantly better domain adaptation results across three retrieval datasets. We release our code for future research.
Anthology ID:
2023.findings-emnlp.110
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1630–1642
URL:
https://aclanthology.org/2023.findings-emnlp.110
DOI:
10.18653/v1/2023.findings-emnlp.110
Cite (ACL):
Che Chen, Ching Yang, Chun-Yi Lin, and Hung-Yu Kao. 2023. Breaking Boundaries in Retrieval Systems: Unsupervised Domain Adaptation with Denoise-Finetuning. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 1630–1642, Singapore. Association for Computational Linguistics.
Cite (Informal):
Breaking Boundaries in Retrieval Systems: Unsupervised Domain Adaptation with Denoise-Finetuning (Chen et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.110.pdf