Meta-training with Demonstration Retrieval for Efficient Few-shot Learning

Aaron Mueller, Kanika Narang, Lambert Mathias, Qifan Wang, Hamed Firooz


Abstract
Large language models show impressive results on few-shot NLP tasks. However, these models are memory- and computation-intensive. Meta-training allows one to leverage smaller models for few-shot generalization in a domain-general and task-agnostic manner; however, these methods alone result in models that may not have sufficient parameterization or knowledge to adapt quickly to a large variety of tasks. To overcome this issue, we propose meta-training with demonstration retrieval, where we use a dense passage retriever to retrieve semantically similar labeled demonstrations for each example, providing more varied supervision. By separating external knowledge from model parameters, we can use meta-training to train parameter-efficient models that generalize well on a larger variety of tasks. We construct a meta-training set from UnifiedQA and CrossFit, and propose a demonstration bank based on UnifiedQA tasks. To our knowledge, our work is the first to combine retrieval with meta-training, to use DPR models to retrieve demonstrations, and to leverage demonstrations from many tasks simultaneously, rather than randomly sampling demonstrations from the training set of the target task. Our approach outperforms a variety of targeted parameter-efficient and retrieval-augmented few-shot methods on QA, NLI, and text classification tasks (including SQuAD, QNLI, and TREC). Our approach can be meta-trained and fine-tuned quickly on a single GPU.
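The core retrieval step described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: a real DPR model is a trained transformer bi-encoder, so the hash-based `embed` function and the example demonstration bank below are purely hypothetical stand-ins used to show the overall shape of dense top-k demonstration retrieval (encode the query, encode each labeled demonstration, rank by inner-product similarity).

```python
import numpy as np

def embed(text, dim=64):
    # Toy stand-in for a DPR-style dense encoder: hash each token into
    # a fixed-size vector, count occurrences, and L2-normalize.
    # A real setup would use a trained bi-encoder instead.
    vec = np.zeros(dim)
    for tok in text.lower().split():
        vec[hash(tok) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def retrieve_demonstrations(query, demo_bank, k=2):
    """Return the k labeled demonstrations most similar to the query
    under inner-product similarity between dense embeddings."""
    q = embed(query)
    ranked = sorted(demo_bank, key=lambda d: -np.dot(q, embed(d["input"])))
    return ranked[:k]

# Hypothetical demonstration bank: (input, output) pairs drawn from
# labeled training data of many tasks.
demo_bank = [
    {"input": "What is the capital of France?", "output": "Paris"},
    {"input": "Who wrote Hamlet?", "output": "Shakespeare"},
    {"input": "What is the capital of Italy?", "output": "Rome"},
]
demos = retrieve_demonstrations("What is the capital of Spain?", demo_bank, k=2)
```

The retrieved demonstrations would then be concatenated with the input as extra context during meta-training and fine-tuning, which is what lets the method keep external knowledge out of the model's parameters.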
Anthology ID:
2023.findings-acl.376
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6049–6064
URL:
https://aclanthology.org/2023.findings-acl.376
DOI:
10.18653/v1/2023.findings-acl.376
Cite (ACL):
Aaron Mueller, Kanika Narang, Lambert Mathias, Qifan Wang, and Hamed Firooz. 2023. Meta-training with Demonstration Retrieval for Efficient Few-shot Learning. In Findings of the Association for Computational Linguistics: ACL 2023, pages 6049–6064, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Meta-training with Demonstration Retrieval for Efficient Few-shot Learning (Mueller et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.376.pdf