Energy-Efficient Recurrent Neural Network With MRAM-Based Probabilistic Activation Functions

Shadi Sheikhfaal, Shaahin Angizi, Ronald F. DeMara

Research output: Contribution to journal › Article › peer-review

3 Scopus citations


Herein, we develop a programmable, energy-efficient hardware implementation for Recurrent Neural Networks (RNNs) with Resistive Random-Access Memory (ReRAM) synapses and ultra-low-power, area-efficient spin-based activation functions. To attain high energy efficiency while maintaining accuracy, a novel Computing-in-Memory (CiM) architecture is proposed to leverage data-level parallelism during the evaluation phase. We employ an MRAM-based Adjustable Probabilistic Activation Function (APAF) via a low-power tunable activation mechanism, providing adjustable levels of accuracy to mimic ideal sigmoid and tanh thresholding, along with a matching algorithm to regulate the neuron properties. Our hardware/software cross-layer simulation shows that the proposed design achieves up to 74.5× higher energy efficiency and an ∼11× area reduction compared to counterpart designs while keeping accuracy comparable.
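The probabilistic activation described above can be modeled behaviorally as a binary stochastic neuron: the device fires with a probability that follows a sigmoid of its input, and averaging repeated firings approximates the ideal activation, with the sample count acting as the adjustable accuracy knob. The sketch below is a hypothetical software model under those assumptions, not the paper's circuit implementation; function names and the sample-count parameter are illustrative.

```python
import math
import random

def sigmoid(x):
    """Ideal sigmoid activation that the stochastic device mimics."""
    return 1.0 / (1.0 + math.exp(-x))

def binary_stochastic_neuron(x, n_samples=1000, seed=0):
    """Behavioral model of a probabilistic (spin-based) activation.

    Each trial fires (outputs 1) with probability sigmoid(x), so the
    mean over n_samples approximates the ideal sigmoid. Increasing
    n_samples raises accuracy at the cost of more switching events,
    mirroring an adjustable accuracy/energy trade-off.
    """
    rng = random.Random(seed)  # seeded for reproducibility of the sketch
    fires = sum(rng.random() < sigmoid(x) for _ in range(n_samples))
    return fires / n_samples

# Averaging many stochastic firings approaches the ideal sigmoid value.
approx = binary_stochastic_neuron(0.5, n_samples=50000)
```

A tanh-like activation follows the same pattern, since tanh(x) = 2·sigmoid(2x) − 1, so the same stochastic primitive can serve both thresholding modes.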

Original language: English (US)
Pages (from-to): 534-540
Number of pages: 7
Journal: IEEE Transactions on Emerging Topics in Computing
Issue number: 2
State: Published - Apr 1 2023

All Science Journal Classification (ASJC) codes

  • Computer Science (miscellaneous)
  • Information Systems
  • Human-Computer Interaction
  • Computer Science Applications

Keywords

  • Binary stochastic neuron
  • computing-in-memory
  • recurrent neural networks
  • spintronics

