|
|
Regularization Optimization Algorithm for Heterogeneous Data Federated Learning Model Based on Structure Enhancement
ZHANG Min1,2, LIANG Meiyu1,2, XUE Zhe1,2, GUAN Zeli1,2, PAN Zhenhui1,2, ZHAO Zehua1,2
1. School of Computer Science, Beijing University of Posts and Telecommunications, Beijing 100876; 2. Beijing Key Laboratory of Intelligent Telecommunication Software and Multimedia, Beijing University of Posts and Telecommunications, Beijing 100876
|
|
Abstract In federated learning, the heterogeneous distribution of local data across clients makes the optimization objectives of client models trained on local datasets inconsistent with that of the global model, leading to client drift and degrading the performance of the global model. To address the performance decline or even divergence of federated learning models caused by non-independent and identically distributed (non-IID) data, a regularization optimization algorithm for heterogeneous data federated learning models based on structure enhancement (FedSER) is proposed from the perspective of improving the generality of local models. While training on heterogeneously distributed local data, each client samples subnetworks in a structured manner. The local data of each client are augmented, and the different subnetworks are trained on the augmented data to learn enhanced representations, yielding client models with stronger generalization. These models counteract the client drift caused by local data heterogeneity and produce a better global model after federated aggregation. Extensive experiments on the CIFAR-10, CIFAR-100 and ImageNet-200 datasets demonstrate the superior performance of FedSER.
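The local training scheme described above (structured subnetwork sampling combined with training the subnetworks on augmented data, followed by standard federated averaging) can be sketched in code. The following is a minimal illustrative sketch rather than the authors' implementation: the width-sliced subnetworks, the fixed width ratios, the distillation-style regularization term and the helper names (SlimmableCNN, local_train, fedavg, augment) are assumptions made for brevity.

# Minimal sketch (PyTorch), not the authors' code: each client trains the full
# network on its raw local data and, in the same step, trains structured
# subnetworks (here: the first fraction of channels) on augmented views,
# regularizing their outputs toward the full-network output.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class SlimmableCNN(nn.Module):
    """Tiny CNN whose conv layers can be evaluated at a reduced width ratio."""
    def __init__(self, num_classes=10, base=32):
        super().__init__()
        self.conv1 = nn.Conv2d(3, base, 3, padding=1)
        self.conv2 = nn.Conv2d(base, 2 * base, 3, padding=1)
        self.head = nn.Linear(2 * base, num_classes)

    def forward(self, x, width=1.0):
        c1 = max(1, int(self.conv1.out_channels * width))
        c2 = max(1, int(self.conv2.out_channels * width))
        # Structured subnetwork: keep only the first c1 / c2 output channels.
        x = F.relu(F.conv2d(x, self.conv1.weight[:c1], self.conv1.bias[:c1], padding=1))
        x = F.max_pool2d(x, 2)
        x = F.relu(F.conv2d(x, self.conv2.weight[:c2, :c1], self.conv2.bias[:c2], padding=1))
        x = F.adaptive_avg_pool2d(x, 1).flatten(1)
        # Zero-pad so the shared classifier head always sees a fixed feature size.
        x = F.pad(x, (0, self.head.in_features - x.shape[1]))
        return self.head(x)

def local_train(model, loader, augment, epochs=1, lr=0.01, widths=(0.75, 0.5), temp=1.0):
    """One client update: cross-entropy on the full network plus a KL term that
    pulls subnetwork outputs on augmented inputs toward the full-network output.
    The width ratios and temperature are illustrative choices."""
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            logits_full = model(x, width=1.0)
            loss = F.cross_entropy(logits_full, y)
            soft_target = F.softmax(logits_full.detach() / temp, dim=1)
            for w in widths:
                logits_sub = model(augment(x), width=w)
                loss = loss + F.kl_div(F.log_softmax(logits_sub / temp, dim=1),
                                       soft_target, reduction="batchmean")
            loss.backward()
            opt.step()
    return model.state_dict()

def fedavg(global_model, client_states, client_sizes):
    """Standard FedAvg aggregation weighted by local dataset size."""
    total = sum(client_sizes)
    avg = copy.deepcopy(client_states[0])
    for k in avg:
        avg[k] = sum(s[k].float() * (n / total) for s, n in zip(client_states, client_sizes))
    global_model.load_state_dict(avg)
    return global_model

In a FedSER-style round, each selected client would run local_train on its non-IID shard (with augment being, for example, random cropping and flipping) and the server would aggregate the returned state dictionaries with fedavg; the subnetwork regularization is what encourages each local model to learn representations that remain useful beyond its own data distribution.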
|
Received: 13 July 2023
|
|
Fund: National Natural Science Foundation of China (No. 62192784, U22B2038, 62172056, 62272058), CAAI-Huawei MindSpore Open Fund (No. CAAIXSJLJJ-2021-007B)
Corresponding Author:
LIANG Meiyu, Ph.D., professor. Her research interests include artificial intelligence, data mining, multimedia information processing and computer vision.
|
About the authors: ZHANG Min, Master's student. Her research interests include federated learning. XUE Zhe, Ph.D., associate professor. His research interests include machine learning, data mining, multimodal/multi-view learning, and emergency detection and analysis. GUAN Zeli, Ph.D. candidate. His research interests include federated learning, graph neural networks and machine learning. PAN Zhenhui, Ph.D. candidate. His research interests include federated learning and machine learning. ZHAO Zehua, Master's student. His research interests include efficient communication in federated learning.
|
|
|
|
|
|