The following publications are possibly variants of this publication:
- Logit Distillation via Student Diversity. Dingyao Chen, Long Lan, Mengzhu Wang, Xiang Zhang, Tianyi Liang, Zhigang Luo. iconip 2023: 338-349 [doi]
- Transpose and Mask: Simple and Effective Logit-Based Knowledge Distillation for Multi-attribute and Multi-label Classification. Yuwei Zhao, Annan Li, Guozhen Peng, Yunhong Wang. prcv 2024: 273-284 [doi]
- Plug and Play Knowledge Distillation for kNN-LM with External Logits. Xuyang Jin, Tao Ge, Furu Wei. ijcnlp 2022: 463-469 [doi]
- Adaptive multi-teacher multi-level knowledge distillation. Yuang Liu, Wei Zhang, Jun Wang. ijon, 415:106-113, 2020. [doi]