Distiller: A Systematic Study of Model Distillation Methods in Natural Language Processing

Haoyu He, Xingjian Shi, Jonas Mueller, Sheng Zha, Mu Li, George Karypis. Distiller: A Systematic Study of Model Distillation Methods in Natural Language Processing. In Nafise Sadat Moosavi, Iryna Gurevych, Angela Fan, Thomas Wolf, Yufang Hou, Ana Marasovic, Sujith Ravi, editors, Proceedings of the Second Workshop on Simple and Efficient Natural Language Processing, SustaiNLP@EMNLP 2021, Virtual, November 10, 2021, pages 119-133. Association for Computational Linguistics, 2021.
