Learning Event-aware Measures for Event Coreference Resolution

Yao Yao, Zuchao Li, Hai Zhao
Abstract
Researchers are witnessing knowledge-inspired natural language processing shift its focus from the entity level to the event level, where event coreference resolution is one of the core challenges. This paper proposes a novel model for within-document event coreference resolution. Taking events rather than entities as the basic units, our model learns and integrates multiple representations derived from both individual events and event pairs. For the former, we introduce several linguistically motivated event-level features that yield more discriminative event representations. For the latter, we employ multiple similarity measures to capture the distinctions between event pairs. Our proposed model achieves a new state of the art on the ACE 2005 benchmark, demonstrating the effectiveness of the proposed framework.
Anthology ID:
2023.findings-acl.855
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13542–13556
URL:
https://aclanthology.org/2023.findings-acl.855
DOI:
10.18653/v1/2023.findings-acl.855
Cite (ACL):
Yao Yao, Zuchao Li, and Hai Zhao. 2023. Learning Event-aware Measures for Event Coreference Resolution. In Findings of the Association for Computational Linguistics: ACL 2023, pages 13542–13556, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Learning Event-aware Measures for Event Coreference Resolution (Yao et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.855.pdf
Video:
https://aclanthology.org/2023.findings-acl.855.mp4