The following publications are possibly variants of this publication:
- MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition. Chuanguang Yang, Zhulin An, Helong Zhou, Linhang Cai, Xiang-zhi, Jiwen Wu, Yongjun Xu, Qian Zhang 0009. ECCV 2022: 534-551 [doi]
- UKD: Debiasing Conversion Rate Estimation via Uncertainty-regularized Knowledge Distillation. Zixuan Xu, Penghui Wei, Weimin Zhang, Shaoguo Liu, Liang Wang, Bo Zheng. WWW 2022: 2078-2087 [doi]