Stochastic Spiking Attention: Accelerating Attention with Stochastic Computing in Spiking Networks

Zihang Song, Prabodh Katti, Osvaldo Simeone, Bipin Rajendran

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Spiking Neural Networks (SNNs) have been recently integrated into Transformer architectures due to their potential to reduce computational demands and to improve power efficiency. Yet, the implementation of the attention mechanism using spiking signals on general-purpose computing platforms remains inefficient. In this paper, we propose a novel framework leveraging stochastic computing (SC) to effectively execute the dot-product attention for SNN-based Transformers. We demonstrate that our approach can achieve high classification accuracy (83.53%) on CIFAR-10 within 10 time steps, which is comparable to the performance of a baseline artificial neural network implementation (83.66%). We estimate that the proposed SC approach can lead to over 6.3× reduction in computing energy and 1.7× reduction in memory access costs for a digital CMOS-based ASIC design. We experimentally validate our stochastic attention block design through an FPGA implementation, which is shown to achieve 48× lower latency as compared to a GPU implementation, while consuming 15× less power.
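The core idea behind stochastic computing is that a value in [0, 1] can be encoded as the firing probability of a Bernoulli bitstream, so that a multiplication reduces to a bitwise AND and a sum reduces to a population count. The sketch below illustrates how a dot product, the building block of attention scores, can be approximated this way. This is an illustrative toy, not the paper's actual SC attention circuit; the function names and the stream length are assumptions.

```python
import random

def bernoulli_stream(p, length, rng):
    """Encode a probability p in [0, 1] as a stochastic bitstream."""
    return [1 if rng.random() < p else 0 for _ in range(length)]

def sc_dot_product(xs, ys, length=4096, seed=0):
    """Approximate the dot product of two vectors with entries in [0, 1]
    using stochastic computing: each element-wise product becomes a
    bitwise AND of two independent Bernoulli bitstreams (since
    P(a AND b) = P(a) * P(b)), and the accumulation is a popcount."""
    rng = random.Random(seed)
    total = 0
    for x, y in zip(xs, ys):
        sx = bernoulli_stream(x, length, rng)
        sy = bernoulli_stream(y, length, rng)
        total += sum(a & b for a, b in zip(sx, sy))
    return total / length

# Exact dot product of these vectors: 0.2*0.5 + 0.8*0.25 = 0.30;
# the stochastic estimate converges to it as the stream length grows.
approx = sc_dot_product([0.2, 0.8], [0.5, 0.25])
```

The appeal for hardware, as the abstract's energy estimates suggest, is that an AND gate replaces a multiplier; the accuracy/latency trade-off is controlled by the bitstream length.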

Original language: English (US)
Title of host publication: 2024 IEEE 6th International Conference on AI Circuits and Systems, AICAS 2024 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 31-35
Number of pages: 5
ISBN (Electronic): 9798350383638
DOIs
State: Published - 2024
Event: 6th IEEE International Conference on AI Circuits and Systems, AICAS 2024 - Abu Dhabi, United Arab Emirates
Duration: Apr 22 2024 - Apr 25 2024

Publication series

Name: 2024 IEEE 6th International Conference on AI Circuits and Systems, AICAS 2024 - Proceedings

Conference

Conference: 6th IEEE International Conference on AI Circuits and Systems, AICAS 2024
Country/Territory: United Arab Emirates
City: Abu Dhabi
Period: 4/22/24 - 4/25/24

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Electrical and Electronic Engineering
  • Instrumentation

Keywords

  • attention
  • hardware accelerator
  • Spiking neural network
  • stochastic computing
  • Transformer
