The following publications are possibly variants of this publication:
- Communication-efficient Federated Learning Framework with Parameter-Ordered Dropout. Qichen Li, Sujie Shao, Chao Yang, Jiewei Chen, Feng Qi 0004, Shaoyong Guo. cscwd 2024: 1195-1200 [doi]
- Communication-Efficient Federated Learning with Adaptive Parameter Freezing. Chen Chen, Hong Xu, Wei Wang 0030, Baochun Li, Bo Li 0001, Li Chen, Gong Zhang. icdcs 2021: 1-11 [doi]
- To Distill or Not to Distill: Toward Fast, Accurate, and Communication-Efficient Federated Distillation Learning. Yuan Zhang 0013, Wenlong Zhang, Lingjun Pu, Tao Lin, Jinyao Yan. iotj, 11(6):10040-10053, March 2024 [doi]
- Two Birds With One Stone: Toward Communication and Computation Efficient Federated Learning. Yi Kong, Wei Yu, Shu Xu, Fei Yu 0003, Yinfei Xu, Yongming Huang 0001. icl, 28(9):2106-2110, September 2024 [doi]
- DROPFL: Client Dropout Attacks Against Federated Learning Under Communication Constraints. Wenjun Qian, Qingni Shen, Haoran Xu, Xi Huang, Zhonghai Wu. icassp 2024: 4870-4874 [doi]