
On-line Access: 2023-09-11

Received: 2023-04-23

Revision Accepted: 2023-08-22


Journal of Zhejiang University SCIENCE C

http://doi.org/10.1631/FITEE.2300284


Federated learning on non-IID and long-tailed data via dual-decoupling


Author(s):  Zhaohui WANG, Hongjiao LI, Jinguo LI, Renhao HU, Baojin WANG

Affiliation(s):  College of Computer Science and Technology, Shanghai University of Electric Power, Shanghai 201306, China

Corresponding email(s):   hjli@shiep.edu.cn, lijg@shiep.edu.cn

Key Words:  Federated learning, Non-IID, Long-tailed data, Decoupling learning, Knowledge distillation


Zhaohui WANG, Hongjiao LI, Jinguo LI, Renhao HU, Baojin WANG. Federated learning on non-IID and long-tailed data via dual-decoupling[J]. Frontiers of Information Technology & Electronic Engineering. https://doi.org/10.1631/FITEE.2300284

@article{title="Federated learning on non-IID and long-tailed data via dual-decoupling",
author="Zhaohui WANG, Hongjiao LI, Jinguo LI, Renhao HU, Baojin WANG",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="-1",
number="-1",
pages="",
year="1998",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2300284"
}



Abstract: 
Federated learning (FL), a cutting-edge distributed machine learning training paradigm, aims to generate a global model by collaboratively training client models without revealing local private data. The co-occurrence of non-IID and long-tailed distributions in FL is a challenge that substantially degrades aggregation performance. In this paper, we present a solution called federated dual-decoupling via model and logits calibration (FedDDC) for non-IID and long-tailed distributions. The model is characterized by the following three aspects. (1) We decouple the global model into a feature extractor and a classifier so that the components affected by the joint problem can be fine-tuned separately. For the biased feature extractor, we propose a client confidence reweighting scheme that assigns an optimal weight to each client to assist calibration. For the biased classifier, we apply a classifier re-balancing method for fine-tuning. (2) We calibrate and integrate the client confidence-reweighted logits with the re-balanced logits to obtain unbiased logits. (3) We are the first to apply decoupled knowledge distillation to this joint problem, enhancing the accuracy of the global model by distilling the knowledge of the unbiased model. Extensive experiments demonstrate that our approach outperforms state-of-the-art methods on non-IID and long-tailed data in FL.
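Aspect (3) of the abstract refers to decoupled knowledge distillation (DKD), which splits the classical KD loss into a target-class term (TCKD) and a non-target-class term (NCKD) that can be weighted independently. The minimal NumPy sketch below illustrates that decomposition only; it is not the paper's implementation, and the function name `dkd_loss` and the hyperparameter values (`alpha`, `beta`, temperature `T`) are illustrative assumptions.

```python
import numpy as np

def softmax(z, t=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / t
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dkd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=8.0, T=4.0):
    """Illustrative decoupled knowledge distillation loss.

    Splits KD into TCKD (binary target vs. non-target distribution) and
    NCKD (distribution over the non-target classes only), then combines
    them as alpha * TCKD + beta * NCKD.
    """
    ps = softmax(student_logits, T)          # student probabilities
    pt = softmax(teacher_logits, T)          # teacher probabilities
    n = student_logits.shape[0]
    idx = np.arange(n)
    eps = 1e-12

    # TCKD: KL divergence between binary (target, non-target) distributions
    bs = np.stack([ps[idx, target], 1.0 - ps[idx, target]], axis=1)
    bt = np.stack([pt[idx, target], 1.0 - pt[idx, target]], axis=1)
    tckd = np.sum(bt * (np.log(bt + eps) - np.log(bs + eps)), axis=1)

    # NCKD: KL divergence between distributions restricted to non-target classes
    mask = np.ones_like(ps, dtype=bool)
    mask[idx, target] = False
    ps_nt = ps[mask].reshape(n, -1)
    pt_nt = pt[mask].reshape(n, -1)
    ps_nt = ps_nt / ps_nt.sum(axis=1, keepdims=True)   # renormalize
    pt_nt = pt_nt / pt_nt.sum(axis=1, keepdims=True)
    nckd = np.sum(pt_nt * (np.log(pt_nt + eps) - np.log(ps_nt + eps)), axis=1)

    # T^2 scaling, as is conventional for temperature-scaled distillation
    return float(np.mean(alpha * tckd + beta * nckd) * T * T)
```

A large `beta` emphasizes the non-target-class knowledge, which classical KD suppresses when the teacher is confident; this independent weighting is what "decoupled" refers to.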






Journal of Zhejiang University-SCIENCE, 38 Zheda Road, Hangzhou 310027, China
Tel: +86-571-87952783; E-mail: cjzhang@zju.edu.cn
Copyright © 2000 - 2024 Journal of Zhejiang University-SCIENCE