The following publications are possibly variants of this publication:
- Self-supervised knowledge distillation for complementary label learning. Jiabin Liu, Biao Li, Minglong Lei, Yong Shi 0001. NN, 155:318-327, 2022. [doi]
- Multi-Label Knowledge Distillation. Penghui Yang, Ming-Kun Xie, Chen-Chen Zong, Lei Feng 0006, Gang Niu 0001, Masashi Sugiyama, Sheng-Jun Huang. ICCV 2023: 17225-17234. [doi]
- Improving Building Extraction by Using Knowledge Distillation to Reduce the Impact of Label Noise. Gang Xu, Min Deng, Geng Sun 0005, Ya Guo 0002, Jie Chen 0048. Remote Sensing, 14(22):5645, 2022. [doi]
- SoLar: Sinkhorn Label Refinery for Imbalanced Partial-Label Learning. Haobo Wang, Mingxuan Xia, Yixuan Li, Yuren Mao, Lei Feng 0006, Gang Chen 0001, Junbo Zhao. NeurIPS 2022. [doi]
- Multi-label Self Knowledge Distillation. Xucong Wang, Pengkun Wang 0001, Shurui Zhang, Miao Fang, Yang Wang 0015. AAAI 2025: 21330-21338. [doi]
- Dealing with partial labels by knowledge distillation. Guangtai Wang, Jintao Huang, Yiqiang Lai, Chi-Man Vong. PR, 158:110965, 2025. [doi]
- Label-enhanced contrastive knowledge distillation. Zike Qiao, Ze Tao, Lingfeng He, Jian Zhang, Zailiang Chen 0001, Hui Sun. KAIS, 68(1):32, December 2026. [doi]
- Noise Separation guided Candidate Label Reconstruction for Noisy Partial Label Learning. Xiaorui Peng, Yuheng Jia, Fuchao Yang, Ran Wang 0001, Min-Ling Zhang. ICLR 2025. [doi]