Revisiting Non-Autoregressive Translation at Scale

Zhihao Wang, Longyue Wang, Jinsong Su, Junfeng Yao, Zhaopeng Tu


Abstract
In real-world systems, scaling has been critical for improving translation quality in autoregressive translation (AT), but its effect has not been well studied for non-autoregressive translation (NAT). In this work, we bridge the gap by systematically studying the impact of scaling on NAT behaviors. Extensive experiments on six WMT benchmarks over two advanced NAT models show that scaling can alleviate the commonly cited weaknesses of NAT models, resulting in better translation performance. To reduce the side effect of scaling on decoding speed, we empirically investigate the impact of the NAT encoder and decoder on translation performance. Experimental results on the large-scale WMT20 En-De task show that an asymmetric architecture (e.g., a bigger encoder and a smaller decoder) can achieve performance comparable to the scaled model while maintaining the decoding-speed advantage of standard NAT models. Finally, we establish a new benchmark by validating scaled NAT models on the scaled dataset, which can serve as a strong baseline for future work. We release code and system outputs at https://github.com/DeepLearnXMU/Scaling4NAT.
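
To make the asymmetric design concrete, below is a minimal, hypothetical PyTorch sketch of a deep-encoder/shallow-decoder Transformer with parallel (non-autoregressive) decoding. The layer counts, dimensions, and tensor shapes are illustrative placeholders, not the configuration used in the paper.

    import torch
    import torch.nn as nn

    # Asymmetric Transformer: deep encoder, shallow decoder (illustrative values only).
    d_model, nhead = 512, 8

    encoder = nn.TransformerEncoder(
        nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True),
        num_layers=12,  # "bigger" encoder
    )
    decoder = nn.TransformerDecoder(
        nn.TransformerDecoderLayer(d_model=d_model, nhead=nhead, batch_first=True),
        num_layers=1,   # "smaller" decoder keeps NAT decoding fast
    )

    src = torch.randn(2, 20, d_model)          # (batch, source length, model dim)
    tgt_queries = torch.randn(2, 24, d_model)  # all target positions decoded in parallel
    memory = encoder(src)
    out = decoder(tgt_queries, memory)         # no causal mask: non-autoregressive
    print(out.shape)                           # torch.Size([2, 24, 512])

The intuition matching the abstract's finding is that NAT latency is dominated by the decoder pass, so moving capacity into the encoder can preserve quality without slowing decoding.
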
Anthology ID: 2023.findings-acl.763
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 12051–12065
URL: https://aclanthology.org/2023.findings-acl.763
DOI: 10.18653/v1/2023.findings-acl.763
Cite (ACL): Zhihao Wang, Longyue Wang, Jinsong Su, Junfeng Yao, and Zhaopeng Tu. 2023. Revisiting Non-Autoregressive Translation at Scale. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12051–12065, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Revisiting Non-Autoregressive Translation at Scale (Wang et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.763.pdf