DiNeR: A Large Realistic Dataset for Evaluating Compositional Generalization

Chengang Hu, Xiao Liu, Yansong Feng


Abstract
Most existing compositional generalization datasets are synthetically generated, resulting in a lack of natural language variation. While there have been recent attempts to introduce non-synthetic datasets for compositional generalization, they suffer from either limited data scale or a lack of diversity in the forms of combinations. To better investigate compositional generalization with more linguistic phenomena and compositional diversity, we propose the DIsh NamE Recognition (DiNeR) task and create a large realistic Chinese dataset. Given a recipe instruction, models are required to recognize the dish name composed of diverse combinations of food, actions, and flavors. Our dataset consists of 3,811 dishes and 228,114 recipes, and involves rich linguistic phenomena such as anaphora, omission, and ambiguity. We provide two strong baselines based on T5 and large language models (LLMs). This work contributes a challenging task, baseline methods to tackle the task, and insights into compositional generalization in the context of dish name recognition.
Anthology ID:
2023.emnlp-main.924
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
14938–14947
URL:
https://aclanthology.org/2023.emnlp-main.924
DOI:
10.18653/v1/2023.emnlp-main.924
Cite (ACL):
Chengang Hu, Xiao Liu, and Yansong Feng. 2023. DiNeR: A Large Realistic Dataset for Evaluating Compositional Generalization. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 14938–14947, Singapore. Association for Computational Linguistics.
Cite (Informal):
DiNeR: A Large Realistic Dataset for Evaluating Compositional Generalization (Hu et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.924.pdf
Video:
https://aclanthology.org/2023.emnlp-main.924.mp4