The following publications are possibly variants of this publication:
- Enhanced Federated Learning with Adaptive Block-wise Regularization and Knowledge Distillation. Qingmin Zeng, JianChun Liu, Hongli Xu, Zhiyuan Wang, Yang Xu 0020, Yangming Zhao. IWQoS 2023: 1-4 [doi]
- Decoupled Knowledge Distillation in Data-Free Federated Learning. Xueqi Sha, Yongli Wang, Ting Fang. IAIC 2023: 164-177 [doi]
- FedX: Unsupervised Federated Learning with Cross Knowledge Distillation. Sungwon Han 0001, Sungwon Park, Fangzhao Wu, Sundong Kim, Chuhan Wu, Xing Xie 0001, Meeyoung Cha. ECCV 2022: 691-707 [doi]
- Resource-Aware Knowledge Distillation for Federated Learning. Zheyi Chen, Pu Tian, Weixian Liao, Xuhui Chen, Guobin Xu, Wei Yu 0002. TETC, 11(3):706-719, July-September 2023. [doi]