BiSNN: Training spiking neural networks with binary weights via Bayesian learning

Hyeryung Jang, Nicolas Skatchkovsky, Osvaldo Simeone

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Artificial Neural Network (ANN)-based inference on battery-powered devices can be made more energy-efficient by restricting the synaptic weights to be binary, hence eliminating the need to perform multiplications. An alternative, emerging, approach relies on the use of Spiking Neural Networks (SNNs), biologically inspired, dynamic, event-driven models that enhance energy efficiency via the use of binary, sparse, activations. In this paper, an SNN model is introduced that combines the benefits of temporally sparse binary activations and of binary weights. Two learning rules are derived, the first based on the combination of straight-through and surrogate gradient techniques, and the second based on a Bayesian paradigm. Experiments validate the performance loss with respect to full-precision implementations, and demonstrate the advantage of the Bayesian paradigm in terms of accuracy and calibration.
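The two learning rules mentioned above can be illustrated at a high level. Below is a minimal numpy sketch, not the paper's actual algorithm: `ste_backward`, `surrogate_spike_grad`, the hyperparameter `beta`, and the clipping range are illustrative assumptions. It shows (i) deterministic binarization with a straight-through gradient, (ii) a sigmoid surrogate for the non-differentiable spike function, and (iii) a Bernoulli mean-field posterior over binary weights as used in Bayesian-style binary training.

```python
import numpy as np

rng = np.random.default_rng(0)

def binarize(w):
    # Deterministic binarization to {-1, +1} used in the forward pass
    # of the straight-through rule.
    return np.where(w >= 0.0, 1.0, -1.0)

def ste_backward(w, grad_out):
    # Straight-through estimator: treat sign() as the identity in the
    # backward pass, zeroing the gradient where |w| exceeds 1
    # (a common hard-clipping choice, assumed here).
    return grad_out * (np.abs(w) <= 1.0).astype(float)

def surrogate_spike_grad(v, threshold=1.0, beta=5.0):
    # Sigmoid surrogate for the derivative of the Heaviside spike
    # non-linearity at membrane potential v; beta is an assumed
    # sharpness hyperparameter.
    s = 1.0 / (1.0 + np.exp(-beta * (v - threshold)))
    return beta * s * (1.0 - s)

def sample_bayesian_weights(lam):
    # Bayesian-style rule (sketch): keep a mean-field Bernoulli
    # posterior over each binary weight, parameterized by natural
    # parameters lam, and draw w in {-1, +1} with
    # P(w = +1) = sigmoid(2 * lam).
    p = 1.0 / (1.0 + np.exp(-2.0 * lam))
    return np.where(rng.random(lam.shape) < p, 1.0, -1.0)
```

Sampling the weights rather than fixing them is what lets the Bayesian approach express uncertainty, which the experiments relate to improved calibration.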

Original language: English (US)
Title of host publication: 2021 IEEE Data Science and Learning Workshop, DSLW 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665428255
DOIs
State: Published - Jun 5 2021
Event: 2021 IEEE Data Science and Learning Workshop, DSLW 2021 - Toronto, Canada
Duration: Jun 5 2021 - Jun 6 2021

Publication series

Name: 2021 IEEE Data Science and Learning Workshop, DSLW 2021

Conference

Conference: 2021 IEEE Data Science and Learning Workshop, DSLW 2021
Country/Territory: Canada
City: Toronto
Period: 6/5/21 - 6/6/21

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Information Systems
  • Education

Keywords

  • Bayesian learning
  • Binary weights
  • Calibration
  • Spiking Neural Networks
