Dynamic Knowledge Distillation for Pre-trained Language Models

Lei Li, Yankai Lin, Shuhuai Ren, Peng Li, Jie Zhou, Xu Sun. Dynamic Knowledge Distillation for Pre-trained Language Models. In Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih, editors, Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021, Virtual Event / Punta Cana, Dominican Republic, 7-11 November 2021, pages 379-389. Association for Computational Linguistics, 2021.

@inproceedings{LiLRLZ021,
  title = {Dynamic Knowledge Distillation for Pre-trained Language Models},
  author = {Lei Li and Yankai Lin and Shuhuai Ren and Peng Li and Jie Zhou and Xu Sun},
  year = {2021},
  url = {https://aclanthology.org/2021.emnlp-main.31},
  pages = {379-389},
  booktitle = {Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021, Virtual Event / Punta Cana, Dominican Republic, 7-11 November, 2021},
  editor = {Marie-Francine Moens and Xuanjing Huang and Lucia Specia and Scott Wen-tau Yih},
  publisher = {Association for Computational Linguistics},
}