Hexatagging: Projective Dependency Parsing as Tagging

Afra Amini, Tianyu Liu, Ryan Cotterell

Abstract
We introduce a novel dependency parser, the hexatagger, that constructs dependency trees by tagging the words in a sentence with elements from a finite set of possible tags. In contrast to many approaches to dependency parsing, our approach is fully parallelizable at training time, i.e., the structure-building actions needed to build a dependency parse can be predicted in parallel. Additionally, exact decoding is linear in time and space complexity. Furthermore, we derive a probabilistic dependency parser that predicts hexatags using no more than a linear model with features from a pretrained language model, i.e., we forsake a bespoke architecture explicitly designed for the task. Despite the generality and simplicity of our approach, we achieve state-of-the-art performance of 96.4 LAS and 97.4 UAS on the Penn Treebank test set. Additionally, our parser's linear time complexity and parallelism significantly improve computational efficiency, with a roughly 10-times speed-up over previous state-of-the-art models during decoding.
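To make the abstract's central claim concrete, the sketch below illustrates (it is not the authors' implementation) what "no more than a linear model with features from a pretrained language model" amounts to: a single linear layer scores a small, finite tag inventory for every word at once. The class name, tag-inventory size, and hidden size are illustrative placeholders; the actual hexatag set and the paper's exact linear-time decoder are defined in the paper itself.

# Minimal sketch, assuming a PyTorch setup; not the authors' code.
import torch
import torch.nn as nn

NUM_TAGS = 8       # hypothetical inventory size; see the paper for the real hexatag set
HIDDEN_SIZE = 768  # hypothetical hidden size of the pretrained encoder

class TaggingHead(nn.Module):
    """Linear tagging head: scores every position independently."""
    def __init__(self, hidden_size: int = HIDDEN_SIZE, num_tags: int = NUM_TAGS):
        super().__init__()
        self.proj = nn.Linear(hidden_size, num_tags)

    def forward(self, lm_features: torch.Tensor) -> torch.Tensor:
        # lm_features: (batch, seq_len, hidden_size) from a pretrained LM.
        # All positions are scored simultaneously, which is what makes
        # training fully parallelizable across structure-building actions.
        return self.proj(lm_features)

# Greedy per-token argmax, shown only for illustration; the paper instead
# decodes the tag sequence exactly in linear time and space.
logits = TaggingHead()(torch.randn(1, 5, HIDDEN_SIZE))
predicted_tags = logits.argmax(dim=-1)  # shape: (1, 5)

Because each word's tag is scored independently, both the forward pass and the loss decompose over positions; only decoding ties the predictions together into a single projective tree.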
Anthology ID:
2023.acl-short.124
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1453–1464
URL:
https://aclanthology.org/2023.acl-short.124
DOI:
10.18653/v1/2023.acl-short.124
Cite (ACL):
Afra Amini, Tianyu Liu, and Ryan Cotterell. 2023. Hexatagging: Projective Dependency Parsing as Tagging. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1453–1464, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Hexatagging: Projective Dependency Parsing as Tagging (Amini et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.124.pdf
Video:
https://aclanthology.org/2023.acl-short.124.mp4