Personalized Federated Learning Based on Sparsity Regularized Bi-level Optimization

LIU Xi¹, LIU Bo², JI Fanfan³, YUAN Xiaotong⁴,⁵

1. School of Computer Science, Nanjing University of Information Science and Technology, Nanjing 210044
2. Walmart Global Tech Hub, Sunnyvale, CA 94086, USA
3. School of Electronics and Information Engineering, Nanjing University of Information Science and Technology, Nanjing 210044
4. State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023
5. School of Intelligence Science and Technology, Nanjing University, Suzhou 215163
|
|
Abstract Personalized federated learning focuses on providing a personalized model for each client, aiming to improve performance on statistically heterogeneous data. However, most existing personalized federated learning algorithms enhance the performance of personalized models at the cost of increasing the number of client parameters and making computation more complex. To address this issue, a personalized federated learning algorithm based on sparsity regularized bi-level optimization (pFedSRB) is proposed in this paper. An ℓ1-norm sparse regularizer is introduced into the personalized update of each client to enhance the sparsity of the personalized model, avoid unnecessary client parameter updates, and reduce model complexity. The personalized federated learning problem is formulated as a bi-level optimization problem, and the inner-level optimization of pFedSRB is solved by the alternating direction method of multipliers (ADMM) to improve the learning speed. Experiments on four federated learning benchmark datasets demonstrate that pFedSRB performs well on heterogeneous data, effectively improving model performance while reducing the time and memory costs required for training.
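The inner-level step described above — an ℓ1-regularized personalized update solved by ADMM — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the names `grad_f`, `lam`, `rho`, and the gradient-step approximation of the v-update are assumptions made here for illustration; the closed-form z-update is the standard soft-thresholding proximal operator of the ℓ1 norm.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def admm_l1_local_update(grad_f, dim, lam=0.1, rho=1.0, lr=0.1,
                         n_iters=100, inner_steps=5):
    """Sketch of an ADMM loop for min_v f(v) + lam * ||v||_1.

    v: personalized client parameters, z: sparse auxiliary copy,
    u: scaled dual variable, grad_f: gradient oracle of the smooth
    local loss f (a hypothetical stand-in for the client objective).
    """
    v = np.zeros(dim)
    z = np.zeros(dim)
    u = np.zeros(dim)
    for _ in range(n_iters):
        # v-update: a few gradient steps on f(v) + (rho/2)||v - z + u||^2
        for _ in range(inner_steps):
            v -= lr * (grad_f(v) + rho * (v - z + u))
        # z-update: closed-form proximal step that enforces sparsity
        z = soft_threshold(v + u, lam / rho)
        # dual ascent on the consensus constraint v = z
        u += v - z
    return z  # sparse personalized parameters
```

For instance, with a quadratic local loss f(v) = 0.5||v − b||², whose gradient is v − b, the returned parameters zero out the small components of b, which is the sparsification effect the abstract attributes to the ℓ1 regularizer.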
|
Received: 29 January 2024
|
|
Fund: National Natural Science Foundation of China (No. U21B2049, 61936005); National Key Research and Development Program of China (No. 2018AAA0100400)
Corresponding author: YUAN Xiaotong, Ph.D., professor. His research interests include machine learning, stochastic optimization, and computer vision.
|
About the authors: LIU Xi, master's student. Her research interests include federated learning, transfer learning, and pattern recognition. LIU Bo, Ph.D. His research interests include machine learning theory and applications. JI Fanfan, Ph.D. candidate. His research interests include pattern recognition, transfer learning, and few-shot learning.
|
|
|
|
|
|