Multilingual Bidirectional Unsupervised Translation through Multilingual Finetuning and Back-Translation

Bryan Li, Mohammad Sadegh Rasooli, Ajay Patel, Chris Callison-Burch


Abstract
We propose a two-stage approach for training a single NMT model to translate unseen languages both to and from English. For the first stage, we initialize an encoder-decoder model with pretrained XLM-R and RoBERTa weights, then perform multilingual fine-tuning on parallel data in 40 languages to English. We find this model can generalize to zero-shot translations on unseen languages. For the second stage, we leverage this generalization ability to generate synthetic parallel data from monolingual datasets, then bidirectionally train with successive rounds of back-translation. Our approach, which we call EcXTra (English-centric Crosslingual (X) Transfer), is conceptually simple, only using a standard cross-entropy objective throughout. It is also data-driven, sequentially leveraging auxiliary parallel data and monolingual data. We evaluate unsupervised NMT results for 7 low-resource languages, and find that each round of back-translation training further refines bidirectional performance. Our final single EcXTra-trained model achieves competitive translation performance in all translation directions, notably establishing a new state-of-the-art for English-to-Kazakh (22.9 vs. 10.4 BLEU). Our code is available at [this URL](https://github.com/manestay/EcXTra).
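The second-stage back-translation loop described in the abstract can be illustrated with a minimal sketch. The Python snippet below, using the Hugging Face transformers library, shows one round: a stage-1 many-to-English model back-translates monolingual target-language text into synthetic English sources, and the resulting pairs are then trained on with the standard cross-entropy objective. All model paths, data, and hyperparameters here are hypothetical placeholders, not the authors' configuration; the actual implementation is in the linked repository.

```python
# Minimal sketch of one back-translation round, assuming a seq2seq
# checkpoint already fine-tuned for many-to-English translation (stage 1).
# The checkpoint path and data below are illustrative, not from EcXTra.

import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "path/to/stage1-many-to-en-model"  # hypothetical checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

def translate(sentences, max_length=128):
    """Generate translations for a batch of monolingual sentences."""
    batch = tokenizer(sentences, return_tensors="pt", padding=True,
                      truncation=True, max_length=max_length)
    with torch.no_grad():
        outputs = model.generate(**batch, max_length=max_length, num_beams=5)
    return tokenizer.batch_decode(outputs, skip_special_tokens=True)

# Back-translate monolingual target-language text into English, yielding
# synthetic (English -> target) pairs for the next round of training.
mono_kk = ["(a Kazakh monolingual sentence)"]       # placeholder data
synthetic_en = translate(mono_kk)
synthetic_pairs = list(zip(synthetic_en, mono_kk))  # (source, reference)

# Training on a synthetic pair uses the standard cross-entropy objective:
# passing `labels` to a seq2seq model returns that loss directly.
src, tgt = synthetic_pairs[0]
inputs = tokenizer(src, return_tensors="pt")
labels = tokenizer(text_target=tgt, return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss  # token-level cross-entropy
loss.backward()
```

In a full round, the updated model would regenerate the synthetic data and train again in the opposite direction, which is how successive rounds refine both translation directions.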
Anthology ID:
2023.loresmt-1.2
Volume:
Proceedings of the Sixth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2023)
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Atul Kr. Ojha, Chao-hong Liu, Ekaterina Vylomova, Flammie Pirinen, Jade Abbott, Jonathan Washington, Nathaniel Oco, Valentin Malykh, Varvara Logacheva, Xiaobing Zhao
Venue:
LoResMT
Publisher:
Association for Computational Linguistics
Pages:
16–31
URL:
https://aclanthology.org/2023.loresmt-1.2
DOI:
10.18653/v1/2023.loresmt-1.2
Cite (ACL):
Bryan Li, Mohammad Sadegh Rasooli, Ajay Patel, and Chris Callison-Burch. 2023. Multilingual Bidirectional Unsupervised Translation through Multilingual Finetuning and Back-Translation. In Proceedings of the Sixth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2023), pages 16–31, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Multilingual Bidirectional Unsupervised Translation through Multilingual Finetuning and Back-Translation (Li et al., LoResMT 2023)
PDF:
https://aclanthology.org/2023.loresmt-1.2.pdf
Video:
https://aclanthology.org/2023.loresmt-1.2.mp4