TY - GEN
T1 - BiSNN: Training Spiking Neural Networks with Binary Weights via Bayesian Learning
T2 - 2021 IEEE Data Science and Learning Workshop, DSLW 2021
AU - Jang, Hyeryung
AU - Skatchkovsky, Nicolas
AU - Simeone, Osvaldo
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/6/5
Y1 - 2021/6/5
AB - Artificial Neural Network (ANN)-based inference on battery-powered devices can be made more energy-efficient by restricting the synaptic weights to be binary, hence eliminating the need to perform multiplications. An alternative, emerging, approach relies on the use of Spiking Neural Networks (SNNs), biologically inspired, dynamic, event-driven models that enhance energy efficiency via the use of binary, sparse, activations. In this paper, an SNN model is introduced that combines the benefits of temporally sparse binary activations and of binary weights. Two learning rules are derived, the first based on the combination of straight-through and surrogate gradient techniques, and the second based on a Bayesian paradigm. Experiments validate the performance loss with respect to full-precision implementations, and demonstrate the advantage of the Bayesian paradigm in terms of accuracy and calibration.
KW - Bayesian learning
KW - Binary weights
KW - Calibration
KW - Spiking Neural Networks
UR - http://www.scopus.com/inward/record.url?scp=85115355753&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85115355753&partnerID=8YFLogxK
U2 - 10.1109/DSLW51110.2021.9523415
DO - 10.1109/DSLW51110.2021.9523415
M3 - Conference contribution
AN - SCOPUS:85115355753
T3 - 2021 IEEE Data Science and Learning Workshop, DSLW 2021
BT - 2021 IEEE Data Science and Learning Workshop, DSLW 2021
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 5 June 2021 through 6 June 2021
ER -