Where to start? Analyzing the potential value of intermediate models

Leshem Choshen, Elad Venezian, Shachar Don-Yehiya, Noam Slonim, Yoav Katz


Abstract
Previous studies observed that finetuned models may be better base models than the vanilla pretrained model. Such a model, finetuned on some source dataset, may provide a better starting point for a new finetuning process on a desired target dataset. Here, we perform a systematic analysis of this intertraining scheme over a wide range of English classification tasks. Surprisingly, our analysis suggests that the potential intertraining gain can be analyzed independently for the target dataset under consideration and for the base model being considered as a starting point. Hence, a performant model is generally strong, even if its training data was not aligned with the target dataset. Furthermore, we leverage our analysis to propose a practical and efficient approach to determine if and how to select a base model in real-world settings. Finally, we release a regularly updated ranking of the best models in the HuggingFace hub per architecture.
Anthology ID:
2023.emnlp-main.90
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Note:
Pages:
1446–1470
URL:
https://aclanthology.org/2023.emnlp-main.90
DOI:
10.18653/v1/2023.emnlp-main.90
Cite (ACL):
Leshem Choshen, Elad Venezian, Shachar Don-Yehiya, Noam Slonim, and Yoav Katz. 2023. Where to start? Analyzing the potential value of intermediate models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 1446–1470, Singapore. Association for Computational Linguistics.
Cite (Informal):
Where to start? Analyzing the potential value of intermediate models (Choshen et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.90.pdf
Video:
https://aclanthology.org/2023.emnlp-main.90.mp4