The following publications are possibly variants of this publication:
- Complementary Relation Contrastive Distillation. Jinguo Zhu, Shixiang Tang, Dapeng Chen, Shijie Yu, Yakun Liu, Mingzhe Rong, Aijun Yang, Xiaohua Wang. CVPR 2021: 9260-9269 [doi]
- Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction. Xinyi Wang, Zitao Wang, Wei Hu. ACL 2023: 12693-12706 [doi]
- Categorical Relation-Preserving Contrastive Knowledge Distillation for Medical Image Classification. Xiaohan Xing, Yuenan Hou, Hang Li, Yixuan Yuan, Hongsheng Li 0001, Max Q.-H. Meng. MICCAI 2021: 163-173 [doi]
- Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning. Huisi Wu, Zhaoze Wang, ZeBin Zhao, Cheng Chen 0013, Jing Qin 0001. TMI, 42(12):3794-3804, December 2023 [doi]
- Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation. Yuxin Ren, Zihan Zhong, Xingjian Shi, Yi Zhu, Chun Yuan, Mu Li 0003. ACL 2023: 1990-2006 [doi]
- Graph Structure Aware Contrastive Knowledge Distillation for Incremental Learning in Recommender Systems. Yuening Wang, Yingxue Zhang, Mark Coates. CIKM 2021: 3518-3522 [doi]
- Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection. Fan Xu, Pinyun Fu, Qi Huang, Bowei Zou, AiTi Aw, Mingwen Wang. EMNLP 2023: 13492-13503 [doi]