The following publications are possibly variants of this publication:
- Unbiased Knowledge Distillation for Recommendation. Gang Chen, Jiawei Chen 0007, Fuli Feng, Sheng Zhou 0004, Xiangnan He 0001. WSDM 2023: 976-984 [doi]
- Adversarial Knowledge Distillation for a Compact Generator. Hideki Tsunashima, Hirokatsu Kataoka, Junji Yamato, Qiu Chen, Shigeo Morishima. ICPR 2021: 10636-10643 [doi]
- Research on Knowledge Distillation of Generative Adversarial Networks. Wei Wang 00250, Baohua Zhang, Tao Cui, Yimeng Chai, Yue Li 0013. DCC 2021: 376 [doi]