A Read-and-Select Framework for Zero-shot Entity Linking

Zhenran Xu, Yulin Chen, Baotian Hu, Min Zhang


Abstract
Zero-shot entity linking (EL) aims at aligning entity mentions to unseen entities, challenging a model's generalization ability. Previous methods largely focus on the candidate retrieval stage and neglect the essential candidate ranking stage, which disambiguates among entities and makes the final linking prediction. In this paper, we propose a read-and-select (ReS) framework that models the two main components of entity disambiguation, i.e., mention-entity matching and cross-entity comparison. First, for each candidate, the reading module leverages mention context to output mention-aware entity representations, enabling mention-entity matching. Then, in the selecting module, we frame the choice of candidates as a sequence labeling problem, and all candidate representations are fused together to enable cross-entity comparison. Our method achieves state-of-the-art performance on the established zero-shot EL dataset ZESHEL with a 2.55% micro-average accuracy gain, without the laborious multi-phase pre-training used in most previous work, demonstrating the effectiveness of both mention-entity and cross-entity interaction.
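The two-stage flow described in the abstract can be sketched in minimal NumPy. This is a hypothetical illustration, not the authors' implementation: the `read` step stands in for a pretrained Transformer reader over the concatenated mention context and entity description, and the `select` step shows cross-entity comparison via self-attention over candidates followed by a per-candidate score (the "sequence labeling" view). All function names, dimensions, and the random weights are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def read(pair_embeddings, W):
    # Reading module (stand-in): map each joint (mention context, candidate)
    # embedding to a mention-aware candidate representation. A real system
    # would run a pretrained cross-encoder over the paired texts instead.
    return np.maximum(pair_embeddings @ W, 0.0)  # (num_candidates, hidden)

def select(mention_aware):
    # Selecting module (stand-in): fuse all candidate representations with
    # scaled dot-product self-attention so each candidate is compared with
    # the others, then emit one score per candidate position.
    d = mention_aware.shape[1]
    scores = mention_aware @ mention_aware.T / np.sqrt(d)
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)       # row-wise softmax
    fused = attn @ mention_aware                  # cross-entity comparison
    return fused.sum(axis=1)                      # (num_candidates,) logits

# Hypothetical setup: 8 retrieved candidates, 16-dim joint embeddings.
num_candidates, hidden = 8, 16
pairs = rng.normal(size=(num_candidates, hidden))
W = rng.normal(size=(hidden, hidden))

logits = select(read(pairs, W))
predicted = int(np.argmax(logits))  # index of the candidate chosen as the link
```

The point of the sketch is the contrast with a plain cross-encoder ranker: because `select` attends across all candidates at once, each candidate's score depends on its competitors, which is the cross-entity interaction the paper argues the ranking stage needs.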
Anthology ID:
2023.findings-emnlp.912
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13657–13666
URL:
https://aclanthology.org/2023.findings-emnlp.912
DOI:
10.18653/v1/2023.findings-emnlp.912
Cite (ACL):
Zhenran Xu, Yulin Chen, Baotian Hu, and Min Zhang. 2023. A Read-and-Select Framework for Zero-shot Entity Linking. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 13657–13666, Singapore. Association for Computational Linguistics.
Cite (Informal):
A Read-and-Select Framework for Zero-shot Entity Linking (Xu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.912.pdf