Rethinking Task-Specific Knowledge Distillation: Contextualized Corpus as Better Textbook

Chang Liu, Chongyang Tao, Jianxin Liang, Tao Shen, Jiazhan Feng, Quzhe Huang, Dongyan Zhao. Rethinking Task-Specific Knowledge Distillation: Contextualized Corpus as Better Textbook. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 10652-10658. Association for Computational Linguistics, 2022.
