INK: Injecting kNN Knowledge in Nearest Neighbor Machine Translation

Wenhao Zhu, Jingjing Xu, Shujian Huang, Lingpeng Kong, Jiajun Chen


Abstract
Neural machine translation has achieved promising results on many translation tasks. However, previous studies have shown that neural models induce a non-smooth representation space, which harms their generalization. Recently, kNN-MT has provided an effective paradigm for smoothing predictions with neighbor representations during inference. Despite promising results, kNN-MT usually incurs large inference overhead. We propose INK, an effective training framework that directly smooths the representation space by adjusting the representations of kNN neighbors with a small number of new parameters. The new parameters are then used to asynchronously refresh the whole representation datastore, yielding new kNN knowledge. This loop keeps running until convergence. Experiments on four benchmark datasets show that INK achieves average gains of 1.99 COMET and 1.0 BLEU, outperforming the state-of-the-art kNN-MT system with 0.02x the memory space and a 1.9x inference speedup.
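The kNN-MT paradigm the abstract builds on can be sketched as follows: a datastore maps decoder hidden states (keys) to target tokens (values), and at each decoding step the NMT softmax is interpolated with a distribution over retrieved neighbors. This is a minimal toy sketch, not the paper's implementation; the datastore contents, the interpolation weight `lam`, the softmax `temperature`, and all sizes here are illustrative assumptions.

```python
import numpy as np

# Toy datastore: in real kNN-MT, keys are decoder hidden states from
# a parallel corpus and values are the corresponding target tokens.
rng = np.random.default_rng(0)
vocab_size, dim, n_entries, k = 8, 4, 100, 4
temperature, lam = 10.0, 0.5  # assumed softmax temperature and mixing weight

keys = rng.normal(size=(n_entries, dim))           # datastore keys
values = rng.integers(0, vocab_size, n_entries)    # datastore token ids

def knn_mt_distribution(query, p_nmt):
    """Interpolate the NMT distribution with one over kNN neighbors."""
    dists = np.linalg.norm(keys - query, axis=1)   # L2 distance to each key
    nn = np.argsort(dists)[:k]                     # indices of k nearest keys
    logits = -dists[nn] / temperature
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                       # softmax over neighbors
    p_knn = np.zeros(vocab_size)
    for w, v in zip(weights, values[nn]):
        p_knn[v] += w                              # aggregate weight per token
    return lam * p_knn + (1 - lam) * p_nmt         # interpolated distribution

query = rng.normal(size=dim)                       # current hidden state
p_nmt = np.full(vocab_size, 1.0 / vocab_size)      # dummy NMT distribution
p = knn_mt_distribution(query, p_nmt)
```

The inference overhead the abstract mentions comes from the nearest-neighbor search over the full datastore at every decoding step; INK instead trains the representation space itself, so no retrieval is needed at inference time.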
Anthology ID:
2023.acl-long.888
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
15948–15959
URL:
https://aclanthology.org/2023.acl-long.888
DOI:
10.18653/v1/2023.acl-long.888
Cite (ACL):
Wenhao Zhu, Jingjing Xu, Shujian Huang, Lingpeng Kong, and Jiajun Chen. 2023. INK: Injecting kNN Knowledge in Nearest Neighbor Machine Translation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 15948–15959, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
INK: Injecting kNN Knowledge in Nearest Neighbor Machine Translation (Zhu et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.888.pdf
Video:
https://aclanthology.org/2023.acl-long.888.mp4