Description

This research develops on-chip training circuits for memristor-based deep neural networks using unsupervised and supervised learning methods. Because the training and recognition of deep networks are computationally intensive, specialized circuits capable of on-chip training could offer significant advantages in speed and power consumption. Memristor crossbar circuits implement neural algorithms very efficiently but can be prone to device variations and faults; on-chip training circuits allow the training algorithm to account for this variability and for faults in the circuits. We use autoencoders for layer-wise pre-training of the deep networks and the back-propagation algorithm for supervised fine-tuning. Our design uses two memristors per synapse for higher weight precision. We propose techniques to reduce the impact of sneak paths in large memristor crossbars and to simulate large crossbars at high speed. We evaluate the training circuits in detail on nonlinearly separable datasets, taking crossbar wire resistance and sneak paths into account, and demonstrate successful training of memristor-based deep networks on the MNIST digit classification and KDD intrusion detection datasets. This work enables the design of high-throughput, energy-efficient, and compact deep learning systems.
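The two-memristors-per-synapse scheme mentioned above can be sketched as follows. This is an illustrative model only (the device range values and function names are assumptions, not taken from the poster): a signed weight is encoded as the difference between two normalized device conductances, since a single passive memristor can only realize a non-negative conductance.

```python
def synapse_weight(g_pos, g_neg, g_min=1e-6, g_max=1e-4):
    """Effective signed weight from a pair of memristor conductances.

    Encoding a weight as the difference of two devices (g_pos - g_neg)
    allows negative weights, which a single memristor's non-negative
    conductance cannot represent, and roughly doubles the usable range.
    g_min/g_max are hypothetical device conductance limits in siemens.
    """
    # Normalize each conductance to [0, 1] over the device's range.
    def norm(g):
        return (g - g_min) / (g_max - g_min)
    return norm(g_pos) - norm(g_neg)

# Example: a net-positive weight from two conductance states.
w = synapse_weight(8e-5, 2e-5)
print(round(w, 3))
```

During training, a weight update is split across the pair: an increase is applied to the positive device, a decrease to the negative one, so the differential reading tracks the learned weight.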

Publication Date

4-9-2016

Project Designation

Graduate Research

Primary Advisor

Tarek M Taha

Primary Advisor's Department

Electrical & Computer Engineering

Keywords

Stander Symposium poster

On-chip Training of Memristor based Deep Neural Networks
