The following publications may be variants of this publication:
- Online adversarial knowledge distillation for graph neural networks. Can Wang 0001, Zhe Wang, Defang Chen 0001, Sheng Zhou 0004, Yan Feng, Chun Chen 0001. ESWA, 237(Part C):121671, March 2024. [doi]
- Boosting Graph Neural Networks via Adaptive Knowledge Distillation. Zhichun Guo, Chunhui Zhang, Yujie Fan, Yijun Tian 0001, Chuxu Zhang, Nitesh V. Chawla. AAAI 2023: 7793-7801 [doi]
- Package Arrival Time Prediction via Knowledge Distillation Graph Neural Network. Lei Zhang 0199, Yong Liu 0020, Zhiwei Zeng, Yiming Cao, Xingyu Wu, Yonghui Xu, Zhiqi Shen 0001, Lizhen Cui. TKDD, 18(5), June 2024. [doi]
- Graph-Free Knowledge Distillation for Graph Neural Networks. Xiang Deng, Zhongfei Zhang. IJCAI 2021: 2321-2327 [doi]
- Multi-teacher Knowledge Distillation for Compressed Video Action Recognition on Deep Neural Networks. Meng-Chieh Wu, Ching-Te Chiu, Kun-Hsuan Wu. ICASSP 2019: 2202-2206 [doi]