Differentially-Private Decentralized Learning in Heterogeneous Multicast Networks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose a power-controlled differentially private decentralized learning algorithm for a set of clients that collaboratively train a common learning model. The network is characterized by a row-stochastic adjacency matrix, which reflects the different channel gains between clients. In our privacy-preserving approach, the transmit power for model updates and the level of injected Gaussian noise are jointly controlled to satisfy given privacy and energy budgets. We show that our proposed algorithm achieves a convergence rate of O(log T), where T is the time horizon of the regret function. Furthermore, our numerical results confirm that the proposed algorithm outperforms existing methods.
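The mechanism described in the abstract, each client perturbing its model update with Gaussian noise and the network mixing updates through a row-stochastic matrix, can be sketched as follows. This is a minimal illustrative sketch only: the least-squares objective, the fixed noise scale `sigma`, and all variable names are assumptions for demonstration, not the paper's actual algorithm, power-control rule, or privacy accounting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n clients jointly fit a shared linear model w by
# minimizing average squared error on their local data.
n, d, T = 4, 3, 200
X = [rng.standard_normal((20, d)) for _ in range(n)]
w_true = rng.standard_normal(d)
y = [Xi @ w_true for Xi in X]

# Row-stochastic mixing matrix W (each row sums to 1), standing in for
# the unequal channel gains between clients in the network model.
W = rng.random((n, n))
W /= W.sum(axis=1, keepdims=True)

sigma = 0.01  # std of injected Gaussian noise (privacy mechanism); assumed fixed here
eta = 0.05    # step size
w = np.zeros((n, d))  # one local model per client

for _ in range(T):
    # Each client perturbs its model with Gaussian noise before multicast.
    noisy = w + sigma * rng.standard_normal(w.shape)
    # Consensus step: mix neighbors' noisy models via the row-stochastic W.
    mixed = W @ noisy
    # Local gradient step on each client's own data.
    grads = np.stack([Xi.T @ (Xi @ wi - yi) / len(yi)
                      for Xi, yi, wi in zip(X, y, mixed)])
    w = mixed - eta * grads

# Average distance of the local models from the ground-truth parameter.
err = np.mean(np.linalg.norm(w - w_true, axis=1))
```

In this toy setting the local models converge to a neighborhood of `w_true` whose size is set by the injected noise; the paper's contribution of jointly tuning transmit power and noise level against privacy and energy budgets is abstracted away here.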

Original language: English (US)
Title of host publication: ISIT 2025 - 2025 IEEE International Symposium on Information Theory, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798331543990
DOIs
State: Published - 2025
Event: 2025 IEEE International Symposium on Information Theory, ISIT 2025 - Ann Arbor, United States
Duration: Jun 22 2025 - Jun 27 2025

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
ISSN (Print): 2157-8095

Conference

Conference: 2025 IEEE International Symposium on Information Theory, ISIT 2025
Country/Territory: United States
City: Ann Arbor
Period: 6/22/25 - 6/27/25

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics
