Abstract
Spiking neural networks (SNNs) are artificial learning models that closely mimic the time-based information encoding and processing mechanisms observed in the brain. Unlike deep learning models, which encode information with real numbers, SNNs encode information in binary spike signals and their arrival times, which could potentially improve the algorithmic efficiency of computation. However, the overall efficiency of learning and inference systems implementing SNNs will depend on their ability to reduce data movement between processor and memory units; in-memory computing architectures employing low-power nanoscale memristive devices would therefore be essential. The requirements and specifications of these devices for realizing SNNs differ considerably from those for regular deep learning models. In this chapter, we introduce some of the fundamental aspects of spike-based information processing and how nanoscale memristive devices could be used to efficiently implement these algorithms for cognitive applications.
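To make the contrast between real-valued activations and spike-time encoding concrete, the minimal sketch below simulates a single leaky integrate-and-fire (LIF) neuron, a common SNN neuron model. It is an illustrative example only, not taken from the chapter; the parameter names and values (`weight`, `threshold`, `tau`) are assumptions chosen for demonstration.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Illustrative only; parameter names and values are assumptions, not from the chapter.

def lif_neuron(input_spike_times, weight=1.0, threshold=1.5,
               tau=20.0, dt=1.0, t_end=100.0):
    """Return output spike times (ms) of a single LIF neuron driven by
    presynaptic spikes arriving at input_spike_times (ms)."""
    v = 0.0                                   # membrane potential
    output_spike_times = []
    spike_steps = {round(t / dt) for t in input_spike_times}
    for step in range(int(t_end / dt)):
        v *= (1.0 - dt / tau)                 # passive leak toward rest
        if step in spike_steps:               # binary input event adds a weighted jump
            v += weight
        if v >= threshold:                    # fire and reset on threshold crossing
            output_spike_times.append(step * dt)
            v = 0.0
    return output_spike_times

# Two input spikes arriving close together push the neuron past threshold;
# the *time* of the output spike, not a real-valued activation, carries the information.
print(lif_neuron([10.0, 12.0]))   # e.g. [12.0]
```

In an in-memory computing realization, the weighted accumulation performed in the loop above would instead be carried out by currents flowing through an array of memristive devices whose conductances store the synaptic weights.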
| Original language | English (US) |
| --- | --- |
| Title of host publication | Memristive Devices for Brain-Inspired Computing |
| Subtitle of host publication | From Materials, Devices, and Circuits to Applications - Computational Memory, Deep Learning, and Spiking Neural Networks |
| Publisher | Elsevier |
| Pages | 399-405 |
| Number of pages | 7 |
| ISBN (Electronic) | 9780081027820 |
| DOIs | |
| State | Published - Jan 1 2020 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- General Engineering
Keywords
- In-memory computing
- Memristor
- Spiking neural network