TY - GEN
T1 - FedGMKD
T2 - 38th Conference on Neural Information Processing Systems, NeurIPS 2024
AU - Zhang, Jianqiao
AU - Shan, Caifeng
AU - Han, Jungong
N1 - Publisher Copyright:
© 2024 Neural Information Processing Systems Foundation. All rights reserved.
PY - 2024
Y1 - 2024
AB - Federated Learning (FL) faces significant challenges due to data heterogeneity across distributed clients. To address this, we propose FedGMKD, a novel framework that combines knowledge distillation and differential aggregation for efficient prototype-based personalized FL without the need for public datasets or server-side generative models. FedGMKD introduces Cluster Knowledge Fusion, utilizing Gaussian Mixture Models to generate prototype features and soft predictions on the client side, enabling effective knowledge distillation while preserving data privacy. Additionally, we implement a Discrepancy-Aware Aggregation Technique that weights client contributions based on data quality and quantity, enhancing the global model's generalization across diverse client distributions. Theoretical analysis confirms the convergence of FedGMKD. Extensive experiments on benchmark datasets, including SVHN, CIFAR-10, and CIFAR-100, demonstrate that FedGMKD outperforms state-of-the-art methods, significantly improving both local and global accuracy in Non-IID data settings.
UR - https://www.scopus.com/pages/publications/105000509827
M3 - Conference Proceeding (ISBN)
AN - SCOPUS:105000509827
VL - 37
T3 - Advances in Neural Information Processing Systems
BT - Advances in Neural Information Processing Systems
A2 - Globerson, A.
A2 - Mackey, L.
A2 - Belgrave, D.
A2 - Fan, A.
A2 - Paquet, U.
A2 - Tomczak, J.
A2 - Zhang, C.
PB - Neural Information Processing Systems Foundation
Y2 - 9 December 2024 through 15 December 2024
ER -