TY - GEN
T1 - Phase-change memory models for deep learning training and inference
AU - Nandakumar, S. R.
AU - Boybat, Irem
AU - Joshi, Vinay
AU - Piveteau, Christophe
AU - Le Gallo, Manuel
AU - Rajendran, Bipin
AU - Sebastian, Abu
AU - Eleftheriou, Evangelos
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/11
Y1 - 2019/11
N2 - Non-volatile analog memory devices such as phase-change memory (PCM) enable designing dedicated connectivity matrices for the hardware implementation of deep neural networks (DNNs). In this in-memory computing approach, the analog conductance states of the memory devices can be gradually updated to train DNNs on-chip, or software-trained connection strengths may be programmed once to the devices to create efficient inference engines. Reliable and computationally simple models that capture the non-ideal programming and temporal evolution of the devices are needed for evaluating the training and inference performance of deep learning hardware based on in-memory computing. In this paper, we present statistically accurate models for PCM, based on the characterization of more than 10,000 devices, that capture the state-dependent nature and variability of the conductance update, conductance drift, and read noise. Integrating the computationally simple device models with deep learning frameworks such as TensorFlow enables us to realistically evaluate the training and inference performance of PCM-array-based hardware implementations of DNNs.
AB - Non-volatile analog memory devices such as phase-change memory (PCM) enable designing dedicated connectivity matrices for the hardware implementation of deep neural networks (DNNs). In this in-memory computing approach, the analog conductance states of the memory devices can be gradually updated to train DNNs on-chip, or software-trained connection strengths may be programmed once to the devices to create efficient inference engines. Reliable and computationally simple models that capture the non-ideal programming and temporal evolution of the devices are needed for evaluating the training and inference performance of deep learning hardware based on in-memory computing. In this paper, we present statistically accurate models for PCM, based on the characterization of more than 10,000 devices, that capture the state-dependent nature and variability of the conductance update, conductance drift, and read noise. Integrating the computationally simple device models with deep learning frameworks such as TensorFlow enables us to realistically evaluate the training and inference performance of PCM-array-based hardware implementations of DNNs.
KW - Deep learning
KW - Inference
KW - Phase-change memory
KW - Statistical model
KW - Training
UR - http://www.scopus.com/inward/record.url?scp=85075017281&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85075017281&partnerID=8YFLogxK
U2 - 10.1109/ICECS46596.2019.8964852
DO - 10.1109/ICECS46596.2019.8964852
M3 - Conference contribution
AN - SCOPUS:85075017281
T3 - 2019 26th IEEE International Conference on Electronics, Circuits and Systems, ICECS 2019
SP - 727
EP - 730
BT - 2019 26th IEEE International Conference on Electronics, Circuits and Systems, ICECS 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 26th IEEE International Conference on Electronics, Circuits and Systems, ICECS 2019
Y2 - 27 November 2019 through 29 November 2019
ER -