Flexible Sharpness-Aware Personalized Federated Learning (AAAI 2025)

Xinda Xing*1, Qiugang Zhan*2, Xiurui Xie†1, Yuning Yang1, Qiang Wang3, Guisong Liu†2

1 University of Electronic Science and Technology of China, 2 Southwestern University of Finance and Economics, 3 Sun Yat-sen University

Contents

- Overview
- Baselines
- QuickStart
- Citation
- Acknowledgments

Overview

Personalized federated learning (PFL) is a new paradigm for addressing the statistical heterogeneity problem in federated learning. Most existing PFL methods focus on combining global and local information through techniques such as model interpolation or parameter decoupling. However, these methods often overlook the generalization potential during local client training. From a local optimization perspective, we propose a simple and general PFL method, Federated learning with Flexible Sharpness-Aware minimization (FedFSA). Specifically, we emphasize the importance of applying a larger perturbation to the critical layers of the local model when using the Sharpness-Aware Minimization (SAM) optimizer. We then design a metric, perturbation sensitivity, to estimate the layer-wise sharpness of each local model. Based on this metric, FedFSA flexibly selects the layers with the highest sharpness and applies the larger perturbation to them. Experimental results show that FedFSA outperforms seven baselines by up to 8.26% in test accuracy.
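As a rough illustration of the idea (not the paper's exact implementation), the sketch below ranks layers by a simple sharpness proxy and applies a larger SAM perturbation radius to the top-k layers. The gradient-norm proxy, the function name `flexible_sam_perturbation`, and the radius values are our own illustrative assumptions; see the paper for the actual perturbation-sensitivity metric and hyperparameters.

```python
import numpy as np

def flexible_sam_perturbation(layer_grads, rho_base=0.05, rho_large=0.5, k=1):
    """Illustrative sketch of layer-flexible SAM.

    layer_grads: dict mapping layer name -> gradient array.
    Layers are ranked by a sharpness proxy (here: gradient norm,
    an assumption, not the paper's perturbation-sensitivity metric);
    the top-k layers receive the larger perturbation radius rho_large,
    the rest the standard radius rho_base.
    """
    # Sharpness proxy per layer (illustrative assumption).
    sensitivity = {name: np.linalg.norm(g) for name, g in layer_grads.items()}
    top_k = set(sorted(sensitivity, key=sensitivity.get, reverse=True)[:k])

    perturbations = {}
    for name, g in layer_grads.items():
        rho = rho_large if name in top_k else rho_base
        # Standard SAM ascent direction, scaled to the chosen radius.
        perturbations[name] = rho * g / (np.linalg.norm(g) + 1e-12)
    return perturbations, top_k
```

The per-layer perturbation would then be added to the weights before the second (descent) gradient computation, as in standard SAM, except that only the selected sharp layers take the larger ascent step.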

Baselines

FedAvg: Communication-Efficient Learning of Deep Networks from Decentralized Data

FedCR: FedCR: Personalized Federated Learning Based on Across-Client Common Representation with Conditional Mutual Information Regularization

FedALA: FedALA: Adaptive Local Aggregation for Personalized Federated Learning

FedSAM/MoFedSAM: Generalized Federated Learning via Sharpness Aware Minimization

FedSpeed: FedSpeed: Larger Local Interval, Less Communication Round, and Higher Generalization Accuracy

FedSMOO: Dynamic Regularized Sharpness Aware Minimization in Federated Learning: Approaching Global Consistency and Smooth Landscape

QuickStart

Refer to the ./FedFSA/run.sh script for basic usage. For detailed hyperparameter configurations, please refer to our paper and its Appendix.

Citation

@inproceedings{xing2025fedfsa,
    title={Flexible Sharpness-Aware Personalized Federated Learning},
    author={Xing, Xinda and Zhan, Qiugang and Xie, Xiurui and Yang, Yuning and Wang, Qiang and Liu, Guisong},
    booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
    volume={39},
    number={20},
    pages={21707--21715},
    year={2025},
    address={Philadelphia, Pennsylvania, USA}
}

Acknowledgments

This repo benefits from FedSpeed, FedSMOO, and FedCR. Thanks for their wonderful work!
