CLC number: TP393
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2023-10-17
Yizhuo CAI, Bo LEI, Qianying ZHAO, Jing PENG, Min WEI, Yushun ZHANG, Xing ZHANG. Communication efficiency optimization of federated learning for computing and network convergence of 6G networks[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.2300122
Communication efficiency optimization of federated learning for computing and network convergence of 6G networks

1 Wireless Signal Processing and Network Laboratory, Beijing University of Posts and Telecommunications, Beijing 100876, China
2 Research Institute of China Telecom Co., Ltd., Beijing 102209, China
3 Beijing Branch of China Telecom Co., Ltd., Beijing 100011, China

Abstract: Federated learning effectively addresses issues such as data privacy by collaboratively training a global model across participating devices. However, in complex network environments, factors such as network topology and device computing power strongly affect both the training and the communication processes of federated learning. As a new network architecture and paradigm in which computing power is measurable, perceivable, allocatable, schedulable, and manageable, the computing power network in 6G can effectively support federated learning training and improve its communication efficiency. Based on information such as service requirements, resource load, network conditions, and device computing power, the computing power network can make decisions about federated learning training and thereby improve communication efficiency. To improve the communication efficiency of federated learning in complex network environments, this paper studies communication efficiency optimization methods for federated learning in the computing power network of 6G, making training-process decisions according to different network conditions and the computing power of participating devices. The simulation experiments, based on two architectures that exist in federated learning, schedule devices to participate in training according to computing power information and optimize communication efficiency during the transmission of model parameters. Simulation results show that the proposed method copes well with complex network situations, effectively balances the local training delay differences among participating devices, improves communication efficiency during the transmission of model parameters, and improves resource utilization in the network.
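The scheduling idea described in the abstract — selecting participants by reported computing power and then averaging their local updates — can be sketched as follows. This is a minimal, hypothetical illustration in Python, not the authors' implementation; the `compute` field, the `local_train` update rule, and the plain-average aggregation are all simplifying assumptions.

```python
def local_train(weights, data, lr=0.1):
    # Hypothetical local update: nudge each weight toward the device's data.
    return [w - lr * (w - d) for w, d in zip(weights, data)]

def select_devices(devices, k):
    # Compute-aware scheduling (assumed policy): pick the k devices with the
    # highest reported computing power, mirroring the paper's idea of
    # scheduling participants based on computing power information.
    return sorted(devices, key=lambda d: d["compute"], reverse=True)[:k]

def fed_avg(global_w, devices, rounds=3, k=2):
    # Federated averaging loop: in each round, only the scheduled devices
    # train locally, and their updates are averaged into the global model.
    for _ in range(rounds):
        chosen = select_devices(devices, k)
        updates = [local_train(global_w, d["data"]) for d in chosen]
        global_w = [sum(ws) / len(ws) for ws in zip(*updates)]
    return global_w
```

In a real computing power network, the selection policy would also weigh network conditions and resource load rather than compute capability alone.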