TY - JOUR
T1 - Accurate deep neural network inference using computational phase-change memory
AU - Joshi, Vinay
AU - Le Gallo, Manuel
AU - Haefeli, Simon
AU - Boybat, Irem
AU - Nandakumar, S. R.
AU - Piveteau, Christophe
AU - Dazzi, Martino
AU - Rajendran, Bipin
AU - Sebastian, Abu
AU - Eleftheriou, Evangelos
N1 - Publisher Copyright:
© 2020, The Author(s).
PY - 2020/12/1
Y1 - 2020/12/1
AB - In-memory computing using resistive memory devices is a promising non-von Neumann approach for making energy-efficient deep learning inference hardware. However, due to device variability and noise, the network needs to be trained in a specific way so that transferring the digitally trained weights to the analog resistive memory devices will not result in significant loss of accuracy. Here, we introduce a methodology to train ResNet-type convolutional neural networks that results in no appreciable accuracy loss when transferring weights to phase-change memory (PCM) devices. We also propose a compensation technique that exploits the batch normalization parameters to improve the accuracy retention over time. We achieve a classification accuracy of 93.7% on CIFAR-10 and a top-1 accuracy of 71.6% on ImageNet benchmarks after mapping the trained weights to PCM. Our hardware results on CIFAR-10 with ResNet-32 demonstrate an accuracy above 93.5% retained over a one-day period, where each of the 361,722 synaptic weights is programmed on just two PCM devices organized in a differential configuration.
UR - http://www.scopus.com/inward/record.url?scp=85084785893&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85084785893&partnerID=8YFLogxK
DO - 10.1038/s41467-020-16108-9
M3 - Article
C2 - 32424184
AN - SCOPUS:85084785893
SN - 2041-1723
VL - 11
JO - Nature Communications
JF - Nature Communications
IS - 1
M1 - 2473
ER -