Predicting Machine Translation Performance on Low-Resource Languages: The Role of Domain Similarity

Eric Khiu, Hasti Toossi, Jinyu Liu, Jiaxu Li, David Anugraha, Juan Flores, Leandro Roman, A. Seza Doğruöz, En-Shiun Lee


Abstract
Fine-tuning and testing multilingual large language models is expensive, which poses a challenge for low-resource languages (LRLs). While previous studies have predicted the performance of natural language processing (NLP) tasks using machine learning methods, they primarily focus on high-resource languages, overlooking LRLs and shifts across domains. Focusing on LRLs, we use classical regression models to investigate three factors that can potentially impact model performance: the size of the fine-tuning corpus, the domain similarity between the fine-tuning and testing corpora, and the language similarity between the source and target languages. Our results indicate that domain similarity has the most important impact on predicting the performance of machine translation models.
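As a rough illustration of the setup the abstract describes, the sketch below fits a classical regression model that predicts an MT quality score from the three factors. It is a minimal sketch only: the feature values, score values, and choice of plain linear regression in scikit-learn are assumptions for illustration, not the authors' code or data.

# Minimal sketch (assumed setup, not the authors' implementation):
# predict an MT performance score from three factors with a classical
# regression model, as described in the abstract.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Hypothetical training records: one row per fine-tuning/testing setup.
# Columns: log fine-tuning corpus size, domain similarity in [0, 1],
# language similarity in [0, 1]. All values are illustrative only.
X = np.array([
    [3.0, 0.9, 0.8],
    [4.5, 0.4, 0.7],
    [3.7, 0.7, 0.3],
    [5.0, 0.2, 0.9],
    [4.1, 0.8, 0.5],
    [3.3, 0.5, 0.6],
])
# Hypothetical MT performance scores (e.g., BLEU) for each setup.
y = np.array([28.1, 19.4, 23.0, 15.2, 25.6, 21.3])

model = LinearRegression()
model.fit(X, y)

# Coefficient magnitudes give a rough sense of each factor's influence;
# the paper's finding is that domain similarity matters most.
for name, coef in zip(
    ["corpus size (log)", "domain similarity", "language similarity"],
    model.coef_,
):
    print(f"{name}: {coef:+.2f}")

# Small-sample cross-validation as a sanity check (illustrative only).
scores = cross_val_score(model, X, y, cv=3, scoring="r2")
print("CV R^2:", scores.round(2))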
Anthology ID:
2024.findings-eacl.100
Original:
2024.findings-eacl.100v1
Version 2:
2024.findings-eacl.100v2
Volume:
Findings of the Association for Computational Linguistics: EACL 2024
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1474–1486
URL:
https://aclanthology.org/2024.findings-eacl.100
Cite (ACL):
Eric Khiu, Hasti Toossi, Jinyu Liu, Jiaxu Li, David Anugraha, Juan Flores, Leandro Roman, A. Seza Doğruöz, and En-Shiun Lee. 2024. Predicting Machine Translation Performance on Low-Resource Languages: The Role of Domain Similarity. In Findings of the Association for Computational Linguistics: EACL 2024, pages 1474–1486, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Predicting Machine Translation Performance on Low-Resource Languages: The Role of Domain Similarity (Khiu et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-eacl.100.pdf
Software:
2024.findings-eacl.100.software.zip
Note:
2024.findings-eacl.100.note.zip