FedADMM-InSa: an inexact and self-adaptive ADMM for federated learning

dc.contributor.author: Song, Yongcun
dc.contributor.author: Wang, Ziqi
dc.contributor.author: Zuazua, Enrique
dc.date.accessioned: 2025-02-17T13:16:16Z
dc.date.available: 2025-02-17T13:16:16Z
dc.date.issued: 2025-01
dc.date.updated: 2025-02-17T13:16:16Z
dc.description.abstract: Federated learning (FL) is a promising framework for learning from distributed data while maintaining privacy. The development of efficient FL algorithms encounters various challenges, including heterogeneous data and systems, limited communication capacities, and constrained local computational resources. Recently developed FedADMM methods show great resilience to both data and system heterogeneity. However, they still suffer from performance deterioration if the hyperparameters are not carefully tuned. To address this issue, we propose an inexact and self-adaptive FedADMM algorithm, termed FedADMM-InSa. First, we design an inexactness criterion for the clients' local updates that eliminates the need to set the local training accuracy empirically. This criterion can be assessed by each client independently based on its own conditions, thereby reducing local computational cost and mitigating the undesirable straggler effect. The convergence of the resulting inexact ADMM is proved under the assumption of strongly convex loss functions. Additionally, we present a self-adaptive scheme that dynamically adjusts each client's penalty parameter, enhancing the algorithm's robustness by removing the need for empirical, per-client penalty parameter choices. Extensive numerical experiments on both synthetic and real-world datasets have been conducted. In some of these tests, our FedADMM-InSa algorithm improves model accuracy by 7.8% while reducing clients' local workloads by 55.7% compared to benchmark algorithms.
dc.identifier.citation: Song, Y., Wang, Z., & Zuazua, E. (2025). FedADMM-InSa: An inexact and self-adaptive ADMM for federated learning. Neural Networks, 181. https://doi.org/10.1016/J.NEUNET.2024.106772
dc.identifier.doi: 10.1016/J.NEUNET.2024.106772
dc.identifier.eissn: 1879-2782
dc.identifier.issn: 0893-6080
dc.identifier.uri: http://hdl.handle.net/20.500.14454/2316
dc.language.iso: eng
dc.publisher: Elsevier Ltd
dc.rights: © 2024 The Authors
dc.subject.other: ADMM
dc.subject.other: Client heterogeneity
dc.subject.other: Federated learning
dc.subject.other: Inexactness criterion
dc.title: FedADMM-InSa: an inexact and self-adaptive ADMM for federated learning
dc.type: journal article
dcterms.accessRights: open access
oaire.citation.title: Neural Networks
oaire.citation.volume: 181
oaire.licenseCondition: https://creativecommons.org/licenses/by-nc/4.0/
oaire.version: VoR
Files: song_fedADMM_2025.pdf (1.18 MB, Adobe Portable Document Format)