Ambiguous Learning from Retrieval: Towards Zero-shot Semantic Parsing

Shan Wu, Chunlei Xin, Hongyu Lin, Xianpei Han, Cao Liu, Jiansong Chen, Fan Yang, Guanglu Wan, Le Sun


Abstract
Current neural semantic parsers take a supervised approach, requiring a considerable amount of training data that is expensive and difficult to obtain. Minimizing the supervision effort is therefore one of the key challenges in semantic parsing. In this paper, we propose the Retrieval as Ambiguous Supervision framework, in which we construct a retrieval system based on pretrained language models to collect high-coverage candidates. Assuming the retrieved candidates always contain the correct ones, we convert the zero-shot task into an ambiguously supervised task. To improve the precision and coverage of such ambiguous supervision, we propose a confidence-driven self-training algorithm, in which a semantic parser is learned and exploited to disambiguate the candidates iteratively. Experimental results show that our approach significantly outperforms state-of-the-art zero-shot semantic parsing methods.
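The abstract describes an iterative disambiguation loop over retrieved candidates. Below is a minimal sketch of how such confidence-driven self-training could look, assuming each utterance comes with a retrieved candidate set and a parser exposing scoring and training hooks; all names here (`Example`, `Parser`, `self_train`) and the threshold value are hypothetical stand-ins, not the authors' actual implementation.

```python
# Hypothetical sketch of confidence-driven self-training over retrieved
# candidates. Names, the scoring function, and the threshold are
# illustrative assumptions, not the paper's implementation.
from dataclasses import dataclass


@dataclass
class Example:
    utterance: str
    candidates: list[str]  # retrieved logical forms, assumed to contain the correct one


class Parser:
    """Toy stand-in for a neural semantic parser."""

    def score(self, utterance: str, logical_form: str) -> float:
        # Placeholder confidence: token overlap in [0, 1]. A real parser
        # would return a model probability for the logical form.
        u = set(utterance.lower().split())
        lf = set(logical_form.lower().split())
        return len(u & lf) / max(len(lf), 1)

    def train(self, pairs: list[tuple[str, str]]) -> None:
        pass  # a real parser would fine-tune on (utterance, logical form) pairs


def self_train(examples: list[Example], parser: Parser,
               n_iters: int = 5, threshold: float = 0.5) -> Parser:
    """Iteratively pseudo-label confident candidates and retrain the parser."""
    for _ in range(n_iters):
        train_set = []
        for ex in examples:
            # Score every retrieved candidate with the current parser.
            scored = [(lf, parser.score(ex.utterance, lf)) for lf in ex.candidates]
            best_lf, conf = max(scored, key=lambda p: p[1])
            # Keep only confidently disambiguated pairs as pseudo-labels.
            if conf >= threshold:
                train_set.append((ex.utterance, best_lf))
        parser.train(train_set)
    return parser


if __name__ == "__main__":
    data = [Example("list flights from boston to denver",
                    ["flights from boston to denver", "fares from boston"])]
    self_train(data, Parser())
```

In the paper's framework, the retrieval system would populate `candidates`, and the confidence threshold would govern the precision/coverage trade-off of the ambiguous supervision that the abstract mentions.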
Anthology ID:
2023.acl-long.787
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
14081–14094
URL:
https://aclanthology.org/2023.acl-long.787
DOI:
10.18653/v1/2023.acl-long.787
Cite (ACL):
Shan Wu, Chunlei Xin, Hongyu Lin, Xianpei Han, Cao Liu, Jiansong Chen, Fan Yang, Guanglu Wan, and Le Sun. 2023. Ambiguous Learning from Retrieval: Towards Zero-shot Semantic Parsing. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 14081–14094, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Ambiguous Learning from Retrieval: Towards Zero-shot Semantic Parsing (Wu et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.787.pdf
Video:
https://aclanthology.org/2023.acl-long.787.mp4