Dynamic Knowledge Distillation for Pre-trained Language Models

Lei Li, Yankai Lin, Shuhuai Ren, Peng Li, Jie Zhou, Xu Sun. Dynamic Knowledge Distillation for Pre-trained Language Models. In Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih, editors, Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), Virtual Event / Punta Cana, Dominican Republic, 7-11 November 2021, pages 379-389. Association for Computational Linguistics, 2021.

Authors

Lei Li

Yankai Lin

Shuhuai Ren

Peng Li

Jie Zhou

Xu Sun
