Dynamic Knowledge Distillation for Pre-trained Language Models

Lei Li, Yankai Lin, Shuhuai Ren, Peng Li, Jie Zhou, Xu Sun. Dynamic Knowledge Distillation for Pre-trained Language Models. In Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih, editors, Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), Virtual Event / Punta Cana, Dominican Republic, November 7-11, 2021, pages 379-389. Association for Computational Linguistics, 2021.
