The following publications are possibly variants of this publication:
- Analyzing Knowledge Distillation in Neural Machine Translation. Dakun Zhang, Josep Maria Crego, Jean Senellart. IWSLT 2018: 23-30 [doi]
- Dual knowledge distillation for bidirectional neural machine translation. Huaao Zhang, Shigui Qiu, Shilong Wu. IJCNN 2021: 1-7 [doi]
- Knowledge Distillation for Multilingual Unsupervised Neural Machine Translation. Haipeng Sun, Rui Wang 0015, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao. ACL 2020: 3525-3535 [doi]
- Nearest Neighbor Knowledge Distillation for Neural Machine Translation. Zhixian Yang, Renliang Sun, Xiaojun Wan 0001. NAACL 2022: 5546-5556 [doi]
- Future-Aware Knowledge Distillation for Neural Machine Translation. Biao Zhang 0002, Deyi Xiong, Jinsong Su, Jiebo Luo. TASLP, 27(12):2278-2287, 2019 [doi]
- Dual Knowledge Distillation for neural machine translation. Yuxian Wan, Wenlin Zhang, Zhen Li, Hao Zhang, Yanxia Li. CSL, 84:101583, March 2024 [doi]
- Continual Knowledge Distillation for Neural Machine Translation. Yuanchi Zhang, Peng Li, Maosong Sun, Yang Liu. ACL 2023: 7978-7996 [doi]