Compressed Particle-Based Federated Bayesian Learning and Unlearning

Jinu Gong, Osvaldo Simeone, Joonhyuk Kang

Research output: Contribution to journal › Article › peer-review

Abstract

Conventional frequentist federated learning (FL) schemes are known to yield overconfident decisions. Bayesian FL addresses this issue by allowing agents to process and exchange uncertainty information encoded in distributions over the model parameters. However, this comes at the cost of a larger per-iteration communication overhead. This letter investigates whether Bayesian FL can still provide advantages in terms of calibration when constraining communication bandwidth. We present compressed particle-based Bayesian FL protocols for FL and federated “unlearning” that apply quantization and sparsification across multiple particles. The experimental results confirm that the benefits of Bayesian FL are robust to bandwidth constraints.
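To illustrate the kind of compression the abstract describes, the sketch below applies top-k sparsification followed by uniform scalar quantization to each of several particles. This is a minimal, hypothetical illustration of the general technique, not the paper's actual protocol; the function name, parameters, and compression choices are assumptions for exposition.

```python
import numpy as np

def compress_update(update, k, num_bits):
    """Illustrative sketch: top-k sparsification followed by uniform
    scalar quantization of a single particle's update vector.
    (Hypothetical helper; not the scheme proposed in the letter.)"""
    flat = update.ravel()
    # Keep only the k largest-magnitude entries (sparsification).
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    # Uniform quantization of the surviving entries.
    scale = np.abs(sparse).max()
    if scale == 0:
        return sparse.reshape(update.shape)
    levels = 2 ** (num_bits - 1) - 1
    quantized = np.round(sparse / scale * levels) / levels * scale
    return quantized.reshape(update.shape)

# Apply the same compression independently to each of several particles,
# as a particle-based (e.g., SVGD-style) scheme would before transmission.
rng = np.random.default_rng(0)
particles = [rng.standard_normal(10) for _ in range(4)]
compressed = [compress_update(p, k=3, num_bits=4) for p in particles]
```

Each compressed particle carries at most k nonzero entries, each representable with num_bits bits plus a shared scale, which is the source of the bandwidth savings studied in the letter.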

Original language: English (US)
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Communications Letters
State: Accepted/In press - 2022
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Modeling and Simulation
  • Computer Science Applications
  • Electrical and Electronic Engineering

Keywords

  • Bayes methods
  • Bayesian learning
  • Federated learning
  • Machine unlearning
  • Protocols
  • Quantization (signal)
  • Servers
  • Stein variational gradient descent
  • Training
  • Uncertainty
  • Wireless communication
