Structural generalization in COGS: Supertagging is (almost) all you need

Alban Petit, Caio Corro, François Yvon


Abstract
In many Natural Language Processing applications, neural networks have been found to fail to generalize on out-of-distribution examples. In particular, several recent semantic parsing datasets have highlighted important limitations of neural networks in cases where compositional generalization is required. In this work, we extend a neural graph-based parsing framework in several ways to alleviate this issue, notably: (1) the introduction of a supertagging step with valency constraints, expressed as an integer linear program; (2) the reduction of the graph prediction problem to the maximum matching problem; (3) the design of an incremental early-stopping training strategy to prevent overfitting. Experimentally, our approach significantly improves results on examples that require structural generalization in the COGS dataset, a known challenging benchmark for compositional generalization. Overall, these results confirm that structural constraints are important for generalization in semantic parsing.
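To give an intuition for point (2), reducing graph prediction to maximum matching can be pictured as a maximum-weight bipartite assignment between argument slots (opened by predicted supertags) and candidate dependents. The sketch below is not the paper's implementation; it is a minimal Python illustration under that assumption, with a hypothetical score matrix and scipy's linear_sum_assignment solver standing in for the actual decoder.

# Minimal sketch (assumed, not the authors' code): argument attachment as
# maximum-weight bipartite matching between supertag argument slots and
# candidate dependents.
import numpy as np
from scipy.optimize import linear_sum_assignment

def attach_arguments(scores: np.ndarray):
    """scores[i, j] = model score for filling argument slot i with dependent j.

    Returns (slot, dependent) pairs forming the highest-scoring one-to-one
    matching, computed with the Hungarian algorithm.
    """
    slot_idx, dep_idx = linear_sum_assignment(scores, maximize=True)
    return list(zip(slot_idx.tolist(), dep_idx.tolist()))

# Toy usage: 2 argument slots, 3 candidate dependents.
scores = np.array([
    [0.9, 0.1, 0.2],   # slot 0 prefers dependent 0
    [0.3, 0.2, 0.8],   # slot 1 prefers dependent 2
])
print(attach_arguments(scores))  # [(0, 0), (1, 2)]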
Anthology ID:
2023.emnlp-main.69
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1089–1101
URL:
https://aclanthology.org/2023.emnlp-main.69
DOI:
10.18653/v1/2023.emnlp-main.69
Cite (ACL):
Alban Petit, Caio Corro, and François Yvon. 2023. Structural generalization in COGS: Supertagging is (almost) all you need. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 1089–1101, Singapore. Association for Computational Linguistics.
Cite (Informal):
Structural generalization in COGS: Supertagging is (almost) all you need (Petit et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.69.pdf
Video:
https://aclanthology.org/2023.emnlp-main.69.mp4