CLC number: TP18
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2023-08-22
Zhaohui WANG, Hongjiao LI, Jinguo LI, Renhao HU, Baojin WANG. Federated learning on non-IID and long-tailed data via dual-decoupling[J]. Frontiers of Information Technology & Electronic Engineering, 2024, 25(5): 728-741.
@article{title="Federated learning on non-IID and long-tailed data via dual-decoupling",
author="Zhaohui WANG, Hongjiao LI, Jinguo LI, Renhao HU, Baojin WANG",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="25",
number="5",
pages="728-741",
year="2024",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2300284"
}
Abstract: Federated learning (FL), a cutting-edge distributed machine-learning paradigm, aims to produce a global model by collaboratively training client models without exposing local private data. The co-occurrence of non-independent and identically distributed (non-IID) data and long-tailed class distributions in FL is a challenge that substantially degrades aggregation performance. In this paper, we present a corresponding solution, federated dual-decoupling via model and logit calibration (FedDDC), for non-IID and long-tailed data. The method is characterized by three aspects. First, we decouple the global model into a feature extractor and a classifier so that the components affected by the joint problem can be fine-tuned separately. For the biased feature extractor, we propose a client confidence re-weighting scheme that assigns each client an optimal aggregation weight to assist calibration. For the biased classifier, we apply a classifier re-balancing method for fine-tuning. We then calibrate and integrate the confidence-re-weighted logits with the re-balanced logits to obtain unbiased logits. Finally, we apply decoupled knowledge distillation to this joint problem for the first time, enhancing the accuracy of the global model by extracting the knowledge of the unbiased model. Extensive experiments demonstrate that our approach outperforms state-of-the-art methods on non-IID and long-tailed data in FL.
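The logit-calibration stage described in the abstract can be sketched in rough form. This is a minimal illustration, not the paper's implementation: the function names (`confidence_weights`, `rebalanced_logits`, `calibrated_logits`), the softmax-over-negative-losses weighting, the logit-adjustment-style re-balancing, and the mixing parameter `alpha` are all assumptions introduced here for clarity.

```python
import numpy as np

def confidence_weights(client_losses):
    # Hypothetical client confidence re-weighting: a client with lower
    # loss on a balanced proxy set is trusted more, via a softmax over
    # negative losses. Weights sum to 1.
    losses = np.asarray(client_losses, dtype=float)
    w = np.exp(-losses)
    return w / w.sum()

def rebalanced_logits(logits, class_counts, tau=1.0):
    # Hypothetical classifier re-balancing in the logit-adjustment style:
    # subtracting tau * log(prior) lifts tail-class logits relative to
    # head-class logits.
    prior = np.asarray(class_counts, dtype=float)
    prior = prior / prior.sum()
    return np.asarray(logits, dtype=float) - tau * np.log(prior)

def calibrated_logits(client_logits, client_losses, class_counts, alpha=0.5):
    # Integrate the confidence-re-weighted client logits with their
    # re-balanced version to approximate the "unbiased" logits that
    # serve as the distillation target.
    w = confidence_weights(client_losses)
    weighted = np.tensordot(w, np.asarray(client_logits, dtype=float), axes=1)
    return alpha * weighted + (1 - alpha) * rebalanced_logits(weighted, class_counts)
```

For example, with two classes counted 90:10, `rebalanced_logits` shifts the tail-class logit upward, so a classifier biased toward the head class is pushed back toward balance before distillation.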