XLM-D: Decorate Cross-lingual Pre-training Model as Non-Autoregressive Neural Machine Translation

Yong Wang, Shilin He, Guanhua Chen, Yun Chen, Daxin Jiang. XLM-D: Decorate Cross-lingual Pre-training Model as Non-Autoregressive Neural Machine Translation. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP 2022), Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 6934-6946. Association for Computational Linguistics, 2022.

Abstract

Abstract is missing.