Multilingual Continual Learning Approaches for Text Classification

Karan Praharaj, Irina Matveeva


Abstract
Multilingual continual learning is important for models that are designed to be deployed over long periods of time and must be updated as new data becomes available. Such models are continually applied to new, unseen data that can be in any of the supported languages. One challenge in this scenario is to ensure consistent model performance throughout the deployment lifecycle, beginning from the moment of first deployment. We empirically assess the strengths and shortcomings of several continual learning methods in a multilingual setting across two tasks.
Anthology ID: 2023.ranlp-1.93
Volume: Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing
Month: September
Year: 2023
Address: Varna, Bulgaria
Editors: Ruslan Mitkov, Galia Angelova
Venue: RANLP
Publisher: INCOMA Ltd., Shoumen, Bulgaria
Pages: 864–870
URL: https://aclanthology.org/2023.ranlp-1.93
Cite (ACL): Karan Praharaj and Irina Matveeva. 2023. Multilingual Continual Learning Approaches for Text Classification. In Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing, pages 864–870, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal): Multilingual Continual Learning Approaches for Text Classification (Praharaj & Matveeva, RANLP 2023)
PDF: https://aclanthology.org/2023.ranlp-1.93.pdf