MimiC: Combating Client Dropouts in Federated Learning by Mimicking Central Updates

Bibliographic Details
Published in: arXiv.org (Apr 8, 2024)
Main Author: Sun, Yuchang
Other Authors: Mao, Yuyi; Zhang, Jun
Published: Cornell University Library, arXiv.org
Online Access: Citation/Abstract; full text available outside of ProQuest
Description
Abstract: Federated learning (FL) is a promising framework for privacy-preserving collaborative learning, where model training tasks are distributed to clients and only the model updates need to be collected at a server. However, when deployed in mobile edge networks, clients may have unpredictable availability and drop out of the training process, which hinders the convergence of FL. This paper tackles this critical challenge. Specifically, we first investigate the convergence of the classical FedAvg algorithm with arbitrary client dropouts. We find that with the common choice of a decaying learning rate, FedAvg oscillates around a stationary point of the global loss function, which is caused by the divergence between the aggregated update and the desired central update. Motivated by this new observation, we design a novel training algorithm named MimiC, in which the server modifies each received model update based on the previous ones. The modified model updates mimic the imaginary central update irrespective of dropped-out clients. Our theoretical analysis shows that this divergence between the aggregated and central updates diminishes with proper learning rates, which guarantees the convergence of MimiC. Simulation results further demonstrate that MimiC maintains stable convergence performance and learns better models than the baseline methods.
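The abstract only sketches the mechanism, so the following is a minimal, hypothetical Python sketch of a server-side aggregation step in the spirit described above: each received client update is adjusted using previously stored updates so that the average over the surviving clients tracks the update a full-participation ("central") round would have produced. The function name mimic_style_correction, the specific correction rule, and the bookkeeping structures are assumptions made for illustration, not the exact algorithm from the paper.

import numpy as np

def mimic_style_correction(received, prev_updates, prev_avg):
    # Illustrative sketch (assumed correction rule, not the paper's exact one):
    # adjust each fresh client update with information from earlier rounds so
    # that averaging over only the clients that reported in still approximates
    # the update of a round with full client participation.
    #   received     : dict {client_id: np.ndarray}, updates arriving this round
    #   prev_updates : dict {client_id: np.ndarray}, each client's last stored update
    #   prev_avg     : np.ndarray, average update over all clients from an earlier round
    corrected = {}
    for cid, g in received.items():
        g_old = prev_updates.get(cid, np.zeros_like(g))
        # Hypothetical correction: shift the fresh update by the gap between
        # the earlier global average and this client's earlier contribution.
        corrected[cid] = g + (prev_avg - g_old)
        prev_updates[cid] = g  # remember the newest update for the next round
    # Aggregate only over the clients that did not drop out this round.
    aggregated = np.mean(list(corrected.values()), axis=0)
    return aggregated, corrected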
ISSN:2331-8422
DOI:10.1109/TMC.2023.3338021
Source: Engineering Database