The following publications are possibly variants of this publication:
- Privileged Graph Distillation for Cold Start Recommendation. Shuai Wang, Kun Zhang, Le Wu, Haiping Ma, Richang Hong, Meng Wang. SIGIR 2021: 1187-1196 [doi]
- Toward Understanding Privileged Features Distillation in Learning-to-Rank. Shuo Yang, Sujay Sanghavi, Holakou Rahmanian, Jan Bakus, S. V. N. Vishwanathan. NeurIPS 2022: [doi]
- Uncertainty-based Heterogeneous Privileged Knowledge Distillation for Recommendation System. Ang Li, Jian Hu, Ke Ding, Xiaolu Zhang, Jun Zhou, Yong He, Xu Min. SIGIR 2023: 2471-2475 [doi]
- Calibration-compatible Listwise Distillation of Privileged Features for CTR Prediction. Xiaoqiang Gui, Yueyao Cheng, Xiang-Rong Sheng, Yunfeng Zhao, Guoxian Yu, Shuguang Han, Yuning Jiang, Jian Xu, Bo Zheng. WSDM 2024: 247-256 [doi]
- Privileged Prior Information Distillation for Image Matting. Cheng Lyu, Jiake Xie, Bo Xu, Cheng Lu 0006, Han Huang 0005, Xin Huang, Ming Wu 0001, Chuang Zhang, Yong Tang. AAAI 2024: 4044-4052 [doi]