Distiller: A Systematic Study of Model Distillation Methods in Natural Language Processing

Haoyu He, Xingjian Shi, Jonas Mueller, Sheng Zha, Mu Li, George Karypis. Distiller: A Systematic Study of Model Distillation Methods in Natural Language Processing. In Nafise Sadat Moosavi, Iryna Gurevych, Angela Fan, Thomas Wolf, Yufang Hou, Ana Marasovic, Sujith Ravi, editors, Proceedings of the Second Workshop on Simple and Efficient Natural Language Processing, SustaiNLP@EMNLP 2021, Virtual, November 10, 2021, pages 119-133. Association for Computational Linguistics, 2021.

@inproceedings{HeSMZ0K21,
  title = {Distiller: A Systematic Study of Model Distillation Methods in Natural Language Processing},
  author = {Haoyu He and Xingjian Shi and Jonas Mueller and Sheng Zha and Mu Li and George Karypis},
  year = {2021},
  url = {https://aclanthology.org/2021.sustainlp-1.13},
  researchr = {https://researchr.org/publication/HeSMZ0K21},
  pages = {119--133},
  booktitle = {Proceedings of the Second Workshop on Simple and Efficient Natural Language Processing, SustaiNLP@EMNLP 2021, Virtual, November 10, 2021},
  editor = {Nafise Sadat Moosavi and Iryna Gurevych and Angela Fan and Thomas Wolf and Yufang Hou and Ana Marasovic and Sujith Ravi},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-955917-01-8},
}