The following publications are possibly variants of this publication:
- Adaptive Block-Wise Regularization and Knowledge Distillation for Enhancing Federated Learning. JianChun Liu, Qingmin Zeng, Hongli Xu, Yang Xu 0020, Zhiyuan Wang, He Huang 0001. IEEE/ACM Transactions on Networking, 32(1):791-805, February 2024. [doi]
- FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation. Leiming Chen, Weishan Zhang, Cihao Dong, Dehai Zhao, Xingjie Zeng, Sibo Qiao, Yichang Zhu, Chee-Wei Tan 0001. Entropy, 26(1):96, January 2024. [doi]
- Resource-Aware Knowledge Distillation for Federated Learning. Zheyi Chen, Pu Tian, Weixian Liao, Xuhui Chen, Guobin Xu, Wei Yu 0002. IEEE Transactions on Emerging Topics in Computing, 11(3):706-719, July - September 2023. [doi]