The following publications are possibly variants of this publication:
- Compressing the Multiobject Tracking Model via Knowledge Distillation. Tianyi Liang, Mengzhu Wang, Junyang Chen, Dingyao Chen, Zhigang Luo, Victor C. M. Leung. TCSS, 11(2):2713-2723, April 2024.
- Compressing Deep Graph Neural Networks via Adversarial Knowledge Distillation. Huarui He, Jie Wang, Zhanqiu Zhang, Feng Wu. KDD 2022: 534-544.
- Private Model Compression via Knowledge Distillation. Ji Wang, Weidong Bao, Lichao Sun, Xiaomin Zhu, Bokai Cao, Philip S. Yu. AAAI 2019: 1190-1197.
- Compressing speaker extraction model with ultra-low precision quantization and knowledge distillation. Yating Huang, Yunzhe Hao, Jiaming Xu 0001, Bo Xu 0002. NN, 154:13-21, 2022.
- Localized Symbolic Knowledge Distillation for Visual Commonsense Models. Jae Sung Park, Jack Hessel, Khyathi Chandu, Paul Pu Liang, Ximing Lu, Peter West, Youngjae Yu, Qiuyuan Huang, Jianfeng Gao, Ali Farhadi, Yejin Choi 0001. NeurIPS 2023.