The following publications are possible variants of this publication:
- Adaptive Contrastive Knowledge Distillation for BERT Compression. Jinyang Guo, Jiaheng Liu, Zining Wang, Yuqing Ma, Ruihao Gong, Ke Xu, Xianglong Liu. ACL 2023: 8941-8953
- Contrastive Distillation on Intermediate Representations for Language Model Compression. Siqi Sun, Zhe Gan, Yuwei Fang, Yu Cheng, Shuohang Wang, Jingjing Liu. EMNLP 2020: 498-508
- Graph-Based Model Compression for HSR Bogies Fault Diagnosis at IoT Edge via Adversarial Knowledge Distillation. Wenqing Wan, Jinglong Chen, Jingsong Xie. IEEE Transactions on Intelligent Transportation Systems, 25(2):1787-1796, February 2024
- Ensemble Modeling with Contrastive Knowledge Distillation for Sequential Recommendation. Hanwen Du, Huanhuan Yuan, Pengpeng Zhao, Fuzhen Zhuang, Guanfeng Liu, Lei Zhao, Yanchi Liu, Victor S. Sheng. SIGIR 2023: 58-67
- Model Selection - Knowledge Distillation Framework for Model Compression. Renhai Chen, Shimin Yuan, Shaobo Wang, Zhenghan Li, Meng Xing, Zhiyong Feng. SSCI 2021: 1-6