The following publications are possibly variants of this publication:
- Student Customized Knowledge Distillation: Bridging the Gap Between Student and Teacher. Yichen Zhu, Yi Wang. ICCV 2021: 5037-5046 [doi]
- Customizing Student Networks From Heterogeneous Teachers via Adaptive Knowledge Amalgamation. Chengchao Shen, Mengqi Xue, Xinchao Wang, Jie Song, Li Sun, Mingli Song. ICCV 2019: 3503-3512 [doi]
- Curriculum Temperature for Knowledge Distillation. Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Li, Jian Yang. AAAI 2023: 1504-1512 [doi]
- MKD-Cooper: Cooperative 3D Object Detection for Autonomous Driving via Multi-Teacher Knowledge Distillation. Zhiyuan Li, Huawei Liang, Hanqi Wang, Mingzhuo Zhao, Jian Wang, Xiaokun Zheng. TIV, 9(1):1490-1500, January 2024. [doi]
- Knowledge Distillation via Multi-Teacher Feature Ensemble. Xin Ye, Rongxin Jiang 0001, Xiang Tian 0002, Rui Zhang, Yaowu Chen. SPL, 31:566-570, 2024. [doi]
- Segmenting Neuronal Structure in 3D Optical Microscope Images via Knowledge Distillation with Teacher-Student Network. Heng Wang, Donghao Zhang, Yang Song, Siqi Liu, Yue Wang, Dagan Feng, Hanchuan Peng, Weidong Cai. ISBI 2019: 228-231 [doi]