Debiasing Generative Named Entity Recognition by Calibrating Sequence Likelihood

Yu Xia, Yongwei Zhao, Wenhao Wu, Sujian Li


Abstract
Recognizing flat, overlapped, and discontinuous entities in a unified manner has attracted increasing attention. Among these efforts, the Seq2Seq formulation prevails for its flexibility and effectiveness: it arranges the output entities into a specific target sequence. However, this introduces a bias by assigning all of the probability mass to the single observed sequence. To alleviate the bias, previous works either augment the data with alternative valid sequences or resort to other formulations. In this paper, we stick to the Seq2Seq formulation and propose a reranking-based approach that redistributes likelihood among candidate sequences according to their performance via a contrastive loss. Extensive experiments show that our simple yet effective method consistently boosts the baseline and yields results competitive with or better than state-of-the-art methods on 8 widely used Named Entity Recognition datasets.
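The calibration idea in the abstract can be illustrated with a minimal sketch: rank candidate output sequences by a quality score (e.g., entity-level F1 against the gold annotation) and penalize the model whenever a higher-quality candidate is not assigned a sufficiently higher sequence log-likelihood. This is a generic pairwise margin ranking loss in the spirit of contrastive sequence-likelihood calibration, not the paper's exact objective; the function name, margin value, and scoring scheme here are illustrative assumptions.

```python
def calibration_loss(log_likelihoods, quality_scores, margin=0.01):
    """Pairwise margin ranking loss over candidate sequences (sketch).

    log_likelihoods: model log-likelihood of each candidate sequence.
    quality_scores:  task metric for each candidate (higher is better),
                     e.g., entity-level F1 against the gold entities.
    Candidates are sorted by quality; each pair where a higher-quality
    candidate is not scored at least margin * rank_gap above a
    lower-quality one contributes a hinge penalty.
    """
    # Sort log-likelihoods by descending quality score.
    ranked = [ll for _, ll in sorted(
        zip(quality_scores, log_likelihoods), key=lambda p: -p[0])]
    loss = 0.0
    for i in range(len(ranked)):
        for j in range(i + 1, len(ranked)):
            # Candidate i outranks j: want ranked[i] >= ranked[j] + margin*(j-i).
            loss += max(0.0, ranked[j] - ranked[i] + margin * (j - i))
    return loss
```

When the model's likelihood ordering already matches the quality ordering with enough separation, the loss is zero; misordered pairs are pushed apart, redistributing probability mass toward better-performing candidate sequences rather than concentrating it on the single observed target.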
Anthology ID:
2023.acl-short.98
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
1137–1148
Language:
URL:
https://aclanthology.org/2023.acl-short.98
DOI:
10.18653/v1/2023.acl-short.98
Bibkey:
Cite (ACL):
Yu Xia, Yongwei Zhao, Wenhao Wu, and Sujian Li. 2023. Debiasing Generative Named Entity Recognition by Calibrating Sequence Likelihood. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1137–1148, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Debiasing Generative Named Entity Recognition by Calibrating Sequence Likelihood (Xia et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.98.pdf
Video:
https://aclanthology.org/2023.acl-short.98.mp4