The following publications are possibly variants of this publication:
- Understanding and Improving Knowledge Distillation for Quantization Aware Training of Large Transformer Encoders. Minsoo Kim, Sihwa Lee, Sukjin Hong, Du-Seong Chang, Jungwook Choi. EMNLP 2022: 6713-6725 [doi]
- Wavelet Knowledge Distillation via Decoupled Target for Scene Text Detection. Kefan Qu, Jianmin Lin, Jinrong Li, Ming Yang, Wangpeng He. ICIG 2023: 150-161 [doi]
- Multi-target Knowledge Distillation via Student Self-reflection. Jianping Gou, Xiangshuo Xiong, Baosheng Yu, Lan Du 0002, Yibing Zhan, Dacheng Tao. IJCV, 131(7):1857-1874, July 2023. [doi]
- Improving drug-target affinity prediction via feature fusion and knowledge distillation. Ruiqiang Lu, Jun Wang, Pengyong Li, Yuquan Li, Shuoyan Tan, Yiting Pan, Huanxiang Liu, Peng Gao, Guotong Xie, Xiaojun Yao. Briefings in Bioinformatics, 24(3), May 2023. [doi]