TY - GEN
T1 - FedSDP
T2 - 41st IEEE International Conference on Data Engineering, ICDE 2025
AU - Moon, Jihoon
AU - Liu, Ling
AU - Kwon, Hyuk Yoon
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
AB - Federated learning (FL) is a privacy-preserving machine learning paradigm that enables multiple clients to collaboratively train a model without sharing their raw data. To address non-independent and identically distributed (non-IID) data across clients, personalized FL (PFL) has been actively investigated. A typical PFL model consists of two parts: 1) the head (i.e., classifier) for the final classification and 2) the body (i.e., feature extractor) for extracting representations from local datasets. The head is maintained separately in each client for personalization, while the body is aggregated for generalization. This study proposes a new PFL framework, Federated Self-Derived Prototypes (FedSDP), which introduces a bridge layer, called a personalized layer, between the head and the body to preserve individual, non-shared local prototypes for each client. The personalized layer decouples the body and head, strengthening generalization and personalization, respectively. Based on this architecture, FedSDP dynamically balances personalization and generalization. To this end, we introduce two dynamic adjustments for generating self-derived prototypes: 1) the global-local similarity weight (GL-Sim Weight) and 2) the personalization early-stopping indicator (P-Stop Indicator). The GL-Sim Weight, based on the similarity between the global and local prototypes, adjusts the degree of personalization of each local model. The P-Stop Indicator, computed from the degree of change in each client's local parameters, determines when to stop personalization early in that client so that training can concentrate further on generalization. Our comprehensive experiments demonstrate that FedSDP outperforms existing state-of-the-art FL frameworks, showing superior effectiveness in non-IID settings. Our code and data are available at https://github.com/bigbases/FedSDP.
KW - Balancing prototypes
KW - Early stopping of personalization
KW - Personalized federated learning
KW - Self-derived prototypes
UR - https://www.scopus.com/pages/publications/105015389908
U2 - 10.1109/ICDE65448.2025.00210
DO - 10.1109/ICDE65448.2025.00210
M3 - Conference contribution
AN - SCOPUS:105015389908
T3 - Proceedings - International Conference on Data Engineering
SP - 2796
EP - 2809
BT - Proceedings - 2025 IEEE 41st International Conference on Data Engineering, ICDE 2025
PB - IEEE Computer Society
Y2 - 19 May 2025 through 23 May 2025
ER -
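
Illustrative note (outside the RIS record above): the abstract describes two dynamic adjustments, the GL-Sim Weight and the P-Stop Indicator, without giving their formulas. The Python snippet below is a minimal, hypothetical sketch of how such quantities could be computed; the cosine-similarity form, the relative-parameter-change criterion, the threshold value, and all function names are assumptions for illustration, not the authors' implementation (see https://github.com/bigbases/FedSDP for the actual code).

# Hedged sketch, NOT the FedSDP implementation. It illustrates, under assumed
# definitions, the two quantities named in the abstract:
#  - GL-Sim Weight: a weight derived from the similarity between per-class
#    global and local prototypes (assumed here to be mean cosine similarity).
#  - P-Stop Indicator: an early-stopping signal for personalization, assumed
#    here to fire when a client's local parameters change only marginally.
import numpy as np


def gl_sim_weight(global_protos: np.ndarray, local_protos: np.ndarray) -> float:
    """Assumed form: mean cosine similarity between per-class global and local
    prototypes, rescaled to [0, 1]. Higher values indicate that the local
    prototypes already agree with the global ones."""
    g = global_protos / (np.linalg.norm(global_protos, axis=1, keepdims=True) + 1e-12)
    l = local_protos / (np.linalg.norm(local_protos, axis=1, keepdims=True) + 1e-12)
    cos = np.sum(g * l, axis=1)              # per-class cosine similarity
    return float((cos.mean() + 1.0) / 2.0)   # map from [-1, 1] to [0, 1]


def p_stop_indicator(prev_params: np.ndarray, curr_params: np.ndarray,
                     threshold: float = 1e-3) -> bool:
    """Assumed form: stop personalizing once the relative change of the
    client's local parameters between rounds falls below a threshold."""
    change = (np.linalg.norm(curr_params - prev_params)
              / (np.linalg.norm(prev_params) + 1e-12))
    return bool(change < threshold)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    g_protos = rng.normal(size=(10, 64))                    # 10 classes, 64-dim prototypes
    l_protos = g_protos + 0.1 * rng.normal(size=(10, 64))   # slightly perturbed local prototypes
    print(f"GL-Sim Weight (assumed cosine form): {gl_sim_weight(g_protos, l_protos):.3f}")

    prev = rng.normal(size=1000)
    curr = prev + 1e-4 * rng.normal(size=1000)               # tiny local update
    print("Stop personalization:", p_stop_indicator(prev, curr))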