The following publications are possibly variants of this publication:
- 3KD: Knowledge distillation via teacher-student cooperative curriculum customization. Chaofei Wang, Ke Yang, Shaowei Zhang, Gao Huang, Shiji Song. ijon, 508:284-292, 2022. [doi]
- Robust distillation for worst-class performance: on the interplay between teacher and student objectives. Serena Lutong Wang, Harikrishna Narasimhan, Yichen Zhou, Sara Hooker, Michal Lukasik, Aditya Krishna Menon. uai 2023: 2237-2247 [doi]
- One-Teacher and Multiple-Student Knowledge Distillation on Sentiment Classification. Xiaoqin Chang, Sophia Yat Mei Lee, Suyang Zhu, Shoushan Li, Guodong Zhou. COLING 2022: 7042-7052 [doi]