Abstract
In-memory computing using resistive memory devices is a promising non-von Neumann approach for making energy-efficient deep learning inference hardware. However, due to device variability and noise, the network needs to be trained in a specific way so that transferring the digitally trained weights to the analog resistive memory devices will not result in significant loss of accuracy. Here, we introduce a methodology to train ResNet-type convolutional neural networks that results in no appreciable accuracy loss when transferring weights to phase-change memory (PCM) devices. We also propose a compensation technique that exploits the batch normalization parameters to improve the accuracy retention over time. We achieve a classification accuracy of 93.7% on CIFAR-10 and a top-1 accuracy of 71.6% on ImageNet benchmarks after mapping the trained weights to PCM. Our hardware results on CIFAR-10 with ResNet-32 demonstrate an accuracy above 93.5% retained over a one-day period, where each of the 361,722 synaptic weights is programmed on just two PCM devices organized in a differential configuration.
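The abstract names two hardware-aware techniques: injecting noise while training so the learned weights tolerate PCM programming variability, and mapping each weight onto a differential pair of conductance values. Below is a minimal PyTorch sketch of both ideas; the Gaussian noise model, the `eta` and `g_max` values, and the helper names (`NoisyConv2d`, `to_differential_pcm`) are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of noise-injected training and differential PCM weight mapping.
# Noise model (additive Gaussian, std proportional to max |weight|) and all
# constants are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyConv2d(nn.Conv2d):
    """Conv layer that perturbs its weights with Gaussian noise during
    training, so the network learns weights robust to PCM write noise."""

    def __init__(self, *args, eta=0.04, **kwargs):
        super().__init__(*args, **kwargs)
        self.eta = eta  # noise std as a fraction of max |weight| (assumed)

    def forward(self, x):
        w = self.weight
        if self.training:
            sigma = self.eta * w.abs().max()
            w = w + torch.randn_like(w) * sigma  # fresh noise every forward pass
        return F.conv2d(x, w, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

def to_differential_pcm(w, g_max=25.0):
    """Map a weight tensor onto two conductance tensors (G+, G-) so that
    w is proportional to G+ - G-, one PCM device per polarity. g_max is an
    assumed device conductance limit (e.g. in microsiemens)."""
    scale = g_max / w.abs().max()
    g_pos = torch.clamp(w, min=0.0) * scale   # positive weights on G+
    g_neg = torch.clamp(-w, min=0.0) * scale  # negative weights on G-
    return g_pos, g_neg, scale
```

The batch-normalization-based compensation can likewise be sketched as a statistics recalibration step: after the PCM conductances have drifted, the BN running statistics are re-estimated on a small calibration set so the normalization matches the drifted analog activations. Continuing the sketch above, the function name, calibration procedure, and update schedule are all assumptions:

```python
def recalibrate_bn(model, calib_loader, device="cpu"):
    """Re-estimate BatchNorm running statistics on a calibration set to
    compensate for conductance drift (assumed recalibration scheme)."""
    model.eval()
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.train()                 # only BN layers update running stats
            m.reset_running_stats()   # start the estimate fresh
            m.momentum = None         # use a cumulative moving average
    with torch.no_grad():             # no gradients: weights stay fixed
        for x, _ in calib_loader:
            model(x.to(device))
    model.eval()
```

Setting `momentum` to `None` makes PyTorch's BatchNorm accumulate a cumulative average over the calibration batches, so the re-estimated statistics do not depend on batch ordering; in this sketch the recalibration would be re-run periodically as the devices continue to drift.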
| Field | Value |
|---|---|
| Original language | English (US) |
| Article number | 2473 |
| Journal | Nature Communications |
| Volume | 11 |
| Issue number | 1 |
| DOIs | |
| State | Published - Dec 1 2020 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- General Chemistry
- General Biochemistry, Genetics and Molecular Biology
- General
- General Physics and Astronomy