MimiC: Combating Client Dropouts in Federated Learning by Mimicking Central Updates

Bibliographic Details
Published in: arXiv.org (Apr 8, 2024), p. n/a
Main Author: Sun, Yuchang
Other Authors: Mao, Yuyi; Zhang, Jun
Published: Cornell University Library, arXiv.org
Subjects: Algorithms, Divergence, Convergence, Clients, Servers, Federated learning, Iterative methods, Edge computing
Online Access: Citation/Abstract
Full text outside of ProQuest

MARC

LEADER 00000nab a2200000uu 4500
001 2828555630
003 UK-CbPIL
022 |a 2331-8422 
024 7 |a 10.1109/TMC.2023.3338021  |2 doi 
035 |a 2828555630 
045 0 |b d20240408 
100 1 |a Sun, Yuchang 
245 1 |a MimiC: Combating Client Dropouts in Federated Learning by Mimicking Central Updates 
260 |b Cornell University Library, arXiv.org  |c Apr 8, 2024 
513 |a Working Paper 
520 3 |a Federated learning (FL) is a promising framework for privacy-preserving collaborative learning, where model training tasks are distributed to clients and only the model updates need to be collected at a server. However, when deployed in mobile edge networks, clients may have unpredictable availability and drop out of the training process, which hinders the convergence of FL. This paper tackles this critical challenge. Specifically, we first investigate the convergence of the classical FedAvg algorithm with arbitrary client dropouts. We find that with the common choice of a decaying learning rate, FedAvg oscillates around a stationary point of the global loss function, which is caused by the divergence between the aggregated update and the desired central update. Motivated by this new observation, we then design a novel training algorithm named MimiC, in which the server modifies each received model update based on the previous ones. The modified updates mimic the imaginary central update irrespective of client dropouts. The theoretical analysis of MimiC shows that the divergence between the aggregated and central updates diminishes with proper learning rates, leading to convergence. Simulation results further demonstrate that MimiC maintains stable convergence and learns better models than the baseline methods. 
653 |a Algorithms 
653 |a Divergence 
653 |a Convergence 
653 |a Clients 
653 |a Servers 
653 |a Federated learning 
653 |a Iterative methods 
653 |a Edge computing 
700 1 |a Mao, Yuyi 
700 1 |a Zhang, Jun 
773 0 |t arXiv.org  |g (Apr 8, 2024), p. n/a 
786 0 |d ProQuest  |t Engineering Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/2828555630/abstract/embedded/6A8EOT78XXH2IG52?source=fedsrch 
856 4 0 |3 Full text outside of ProQuest  |u http://arxiv.org/abs/2306.12212
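
Note: The abstract (field 520) describes a server-side idea: when clients drop out, their missing updates are compensated using information from earlier rounds, so the aggregate tracks the hypothetical full-participation (central) update. Below is a minimal Python sketch of one such memory-based compensation scheme. The function name, data layout, and the substitute-the-last-known-update rule are illustrative assumptions for exposition; they are not the paper's exact MimiC correction, which modifies each received update rather than simply reusing stale ones.

import numpy as np

def aggregate_with_memory(received, memory, num_clients):
    """Average one update per client, substituting stored updates for dropouts.

    received: dict {client_id: np.ndarray} -- updates from clients active this round
    memory:   list of np.ndarray -- latest known update per client (refreshed in place)
    """
    for cid, update in received.items():
        memory[cid] = update  # refresh memory for clients that reported this round
    # Average over all clients, active or not, to approximate the
    # full-participation ("central") update the abstract refers to.
    return sum(memory) / num_clients

# Usage: 4 clients, where clients 1 and 3 drop out in this round.
dim, n = 3, 4
memory = [np.zeros(dim) for _ in range(n)]         # no prior updates yet
received = {0: np.ones(dim), 2: 2 * np.ones(dim)}  # only clients 0 and 2 report
global_update = aggregate_with_memory(received, memory, n)
print(global_update)  # -> [0.75 0.75 0.75]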