CoAug: Combining Augmentation of Labels and Labelling Rules

Rakesh R. Menon, Bingqing Wang, Jun Araki, Zhengyu Zhou, Zhe Feng, Liu Ren


Abstract
Collecting labeled data for Named Entity Recognition (NER) tasks is challenging due to the high cost of manual annotations. Instead, researchers have proposed few-shot self-training and rule-augmentation techniques to minimize the reliance on large datasets. However, inductive biases and a restricted logical-language lexicon, respectively, can limit the ability of these models to perform well. In this work, we propose CoAug, a co-augmentation framework that improves few-shot models and rule-augmentation models by bootstrapping predictions from each model. By leveraging both rules and neural model predictions to train our models, we complement the strengths of each and achieve the best of both worlds. In our experiments, our best CoAug model outperforms strong weak-supervision-based NER models by at least 6.5 F1 points.
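
The following Python sketch illustrates the kind of co-augmentation loop the abstract describes: a few-shot neural tagger and a rule-based tagger cross-feed their confident predictions on unlabeled text. All names here (FewShotTagger, RuleTagger, coaug) and the confidence threshold are hypothetical placeholders for illustration only; the paper's actual models, rule language, and selection criteria may differ.

# Hypothetical sketch only: FewShotTagger, RuleTagger, and coaug() are
# placeholder names introduced for illustration, not the paper's interfaces.

from dataclasses import dataclass
from typing import List, Tuple

Span = Tuple[int, int, str]          # (start token, end token, entity type)
ScoredSpan = Tuple[Span, float]      # span plus the labeller's confidence


@dataclass
class Example:
    tokens: List[str]
    spans: List[Span]


class FewShotTagger:
    """Stand-in for a few-shot neural NER model."""
    def fit(self, examples: List[Example]) -> None:
        pass  # placeholder: fine-tune on the seed plus augmented examples

    def predict(self, tokens: List[str]) -> List[ScoredSpan]:
        return []  # placeholder: a real model returns scored entity spans


class RuleTagger:
    """Stand-in for a rule-augmentation labeller."""
    def induce_rules(self, examples: List[Example]) -> None:
        pass  # placeholder: grow the rule set from labelled examples

    def predict(self, tokens: List[str]) -> List[ScoredSpan]:
        return []  # placeholder: apply the current rules to the sentence


def coaug(seed: List[Example], unlabeled: List[List[str]],
          rounds: int = 5, threshold: float = 0.9) -> FewShotTagger:
    """Co-augmentation loop: each labeller's confident predictions on
    unlabeled text become extra supervision for the other labeller."""
    neural, rules = FewShotTagger(), RuleTagger()
    neural_data, rule_data = list(seed), list(seed)
    for _ in range(rounds):
        neural.fit(neural_data)
        rules.induce_rules(rule_data)
        new_neural, new_rule = list(seed), list(seed)
        for tokens in unlabeled:
            n_spans = [s for s, p in neural.predict(tokens) if p >= threshold]
            r_spans = [s for s, p in rules.predict(tokens) if p >= threshold]
            if n_spans:   # confident neural output trains the rule inducer
                new_rule.append(Example(tokens, n_spans))
            if r_spans:   # confident rule matches train the neural model
                new_neural.append(Example(tokens, r_spans))
        neural_data, rule_data = new_neural, new_rule
    neural.fit(neural_data)
    return neural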
Anthology ID: 2023.findings-acl.577
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 9062–9071
URL: https://aclanthology.org/2023.findings-acl.577
DOI: 10.18653/v1/2023.findings-acl.577
Cite (ACL): Rakesh R. Menon, Bingqing Wang, Jun Araki, Zhengyu Zhou, Zhe Feng, and Liu Ren. 2023. CoAug: Combining Augmentation of Labels and Labelling Rules. In Findings of the Association for Computational Linguistics: ACL 2023, pages 9062–9071, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): CoAug: Combining Augmentation of Labels and Labelling Rules (R. Menon et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.577.pdf