The following publications are possible variants of this publication:
- FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning. Yinlin Zhu, Xunkai Li, Zhengyu Wu, Di Wu, Miao Hu, Rong-Hua Li. IJCAI 2024: 5716-5724 [doi]
- Combining Curriculum Learning and Knowledge Distillation for Dialogue Generation. Qingqing Zhu, Xiuying Chen, Pengfei Wu, Junfei Liu, Dongyan Zhao 0001. EMNLP 2021: 1284-1295 [doi]
- UNIDEAL: Curriculum Knowledge Distillation Federated Learning. Yuwen Yang, Chang Liu 0078, Xun Cai, Suizhi Huang, Hongtao Lu, Yue Ding 0001. ICASSP 2024: 7145-7149 [doi]
- Curriculum Temperature for Knowledge Distillation. Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Li, Jian Yang. AAAI 2023: 1504-1512 [doi]
- Variational Data-Free Knowledge Distillation for Continual Learning. Xiaorong Li, Shipeng Wang, Jian Sun 0009, Zongben Xu. IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(10):12618-12634, October 2023. [doi]
- Data-Free Knowledge Filtering and Distillation in Federated Learning. Zihao Lu, Junli Wang 0001, Changjun Jiang. IEEE Transactions on Big Data, 11(3):1128-1143, June 2025. [doi]
- Data-Free Knowledge Distillation with Positive-Unlabeled Learning. Jialiang Tang, Xiaoyan Yang, Xin Cheng, Ning Jiang, Wenxin Yu, Peng Zhang. ICONIP 2021: 309-320 [doi]
- Curriculum Hierarchical Knowledge Distillation for Bias-Free Survival Prediction. Chaozhuo Li, Zhihao Tang 0002, Mingji Zhang, Zhiquan Liu, Litian Zhang, Xi Zhang 0008. IJCAI 2025: 1332-1340 [doi]