Limitations of Knowledge Distillation for Zero-shot Transfer Learning

Saleh Soltan, Haidar Khan, Wael Hamza. Limitations of Knowledge Distillation for Zero-shot Transfer Learning. In Nafise Sadat Moosavi, Iryna Gurevych, Angela Fan, Thomas Wolf, Yufang Hou, Ana Marasović, Sujith Ravi, editors, Proceedings of the Second Workshop on Simple and Efficient Natural Language Processing (SustaiNLP@EMNLP 2021), Virtual, November 10, 2021, pages 22–31. Association for Computational Linguistics, 2021.

Abstract

No abstract is available for this record.