Self-Influence Guided Data Reweighting for Language Model Pre-training

Megh Thakkar, Tolga Bolukbasi, Sriram Ganapathy, Shikhar Vashishth, Sarath Chandar, Partha Talukdar


Abstract
Language Models (LMs) pre-trained with self-supervision on large text corpora have become the default starting point for developing models for various NLP tasks. Once the pre-training corpus has been assembled, all data samples in the corpus are treated with equal importance during LM pre-training. However, due to varying levels of relevance and quality of data, treating all samples with equal importance may not be the optimal choice. While data reweighting has been explored in the context of task-specific supervised learning and LM fine-tuning, model-driven reweighting of pre-training data has not been explored. We fill this important gap and propose PRESENCE, a method for jointly reweighting samples, leveraging self-influence (SI) scores as an indicator of sample importance, and pre-training. PRESENCE promotes novelty and stability for model pre-training. Through extensive analysis spanning multiple model sizes, datasets, and tasks, we present PRESENCE as an important first step in the research direction of sample reweighting for pre-training language models.
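To make the core idea concrete, the following is a minimal sketch of self-influence-based sample reweighting. It assumes a TracIn-style approximation in which a sample's self-influence is the squared norm of its own loss gradient, and uses a toy logistic-regression model plus an illustrative softmax reweighting rule; the function names and the specific reweighting scheme are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def self_influence(w, x, y):
    # TracIn-style self-influence proxy: squared norm of the
    # per-sample gradient of a binary cross-entropy loss w.r.t. w.
    p = 1.0 / (1.0 + np.exp(-x @ w))
    grad = (p - y) * x
    return float(grad @ grad)

def reweight(scores, temperature=1.0):
    # Illustrative rule: turn SI scores into sampling weights via a
    # softmax. A real scheme might instead down-weight very high-SI
    # samples, which can indicate noisy or unusual data.
    z = np.asarray(scores, dtype=float) / temperature
    z -= z.max()  # numerical stability
    expz = np.exp(z)
    return expz / expz.sum()

# Toy data: 8 samples, 4 features.
rng = np.random.default_rng(0)
w = rng.normal(size=4)
X = rng.normal(size=(8, 4))
y = rng.integers(0, 2, size=8)

scores = [self_influence(w, x, yi) for x, yi in zip(X, y)]
weights = reweight(scores)
```

The resulting `weights` form a distribution over the batch and could be used, for example, to scale per-sample losses or to bias which samples are drawn during pre-training.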
Anthology ID:
2023.emnlp-main.125
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2033–2045
URL:
https://aclanthology.org/2023.emnlp-main.125
DOI:
10.18653/v1/2023.emnlp-main.125
Cite (ACL):
Megh Thakkar, Tolga Bolukbasi, Sriram Ganapathy, Shikhar Vashishth, Sarath Chandar, and Partha Talukdar. 2023. Self-Influence Guided Data Reweighting for Language Model Pre-training. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 2033–2045, Singapore. Association for Computational Linguistics.
Cite (Informal):
Self-Influence Guided Data Reweighting for Language Model Pre-training (Thakkar et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.125.pdf
Video:
https://aclanthology.org/2023.emnlp-main.125.mp4