PhD Defense: "Hardware Aware and Architecture Friendly Training of Memristive Crossbar Circuits as Neural Network Pattern Classifiers"

Elham Zamanidoost

November 21st (Monday), 2:00pm
Harold Frank Hall (HFH), Rm 4164

The field of artificial neural networks is experiencing a resurgence of interest due to increased demand for intelligent systems that can process data with minimal human intervention. In the information era, classifying, recognizing, and clustering data are becoming everyday tasks which should be performed accurately and efficiently. High-performance computing systems, although accurate, are not very efficient at implementing neural networks and are limited by the speed and power challenges of the von Neumann architecture. The development of specialized hardware presents itself as a solution to this challenge and opens the door to a field of research that has been explored using conventional as well as emerging technologies. Among the new technologies that are candidates for neural network implementation, memristor crossbars have been in the spotlight in recent years due to their numerous attractive characteristics. The memristor, a two-terminal nanodevice, is highly scalable and can be fabricated in three-dimensional arrays to achieve the high density of connections required for powerful neural networks, such as convolutional nets. Moreover, its nonlinear I-V characteristics, along with its non-volatility, make it well suited for synapse implementation.

Therefore, in this talk, our goal is to consider memristive crossbar circuits as a synaptic layer in a feedforward neural network and to study and develop efficient training algorithms for such an implementation. Our focus is mainly on transistor-less crossbar implementations of weight layers, which yield the highest density of connections per unit area.
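To make the crossbar-as-weight-layer idea concrete, here is a minimal sketch (not from the dissertation) of how a crossbar computes a layer's output in the analog domain: by Ohm's law and Kirchhoff's current law, each column current is the dot product of the input voltage vector with that column's conductances. Signed weights are commonly mapped to a differential pair of conductances, w = G+ - G-; that convention, and the function name `crossbar_layer`, are assumptions for illustration.

```python
def crossbar_layer(voltages, g_plus, g_minus):
    """Return output currents I_j = sum_i V_i * (G+_ij - G-_ij).

    voltages: input voltages applied to the rows.
    g_plus, g_minus: row-major conductance matrices of the two
    crossbars forming the differential weight pairs.
    """
    n_out = len(g_plus[0])
    currents = [0.0] * n_out
    for i, v in enumerate(voltages):
        for j in range(n_out):
            # Each device contributes V_i * G_ij to its column current.
            currents[j] += v * (g_plus[i][j] - g_minus[i][j])
    return currents

# Example: 2 inputs, 1 output; effective weights are 0.5 and -0.25,
# so the output is 1.0 * 0.5 + 2.0 * (-0.25) = 0.
print(crossbar_layer([1.0, 2.0], [[0.6], [0.1]], [[0.1], [0.35]]))
```

The key point is that the multiply-accumulate happens in a single analog step across the whole array, which is the source of the density and efficiency advantage discussed above.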

We have explored ex-situ, in-situ, and hybrid training methods for multilayer neural networks using an accurate memristor dynamics model, to expose the challenges of using such devices as synapses and to offer solutions for efficiently tuning all (or several) devices in an array in parallel. To show the resilience of our training approach, we have considered realistic phenomena associated with emerging technologies, such as device-to-device switching variation, stuck-at-open (and stuck-at-closed) defects within the memristive crossbar, and the limitations of tuning devices in large arrays. Our training algorithms are tested against standard, well-known benchmarks to compare the effectiveness and performance of our trained networks against other state-of-the-art methods as well as software-implemented training.
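As a rough illustration of the nonidealities mentioned above (this is an assumed toy setup, not the dissertation's algorithm), the sketch below trains a single-layer perceptron in an in-situ style while modeling two defects: stuck devices that ignore programming pulses, and device-to-device variation modeled as a random multiplicative gain on each weight update.

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

def train(samples, labels, stuck, epochs=50, lr=0.1, variation=0.3):
    """Perceptron training with stuck-at defects and update variation."""
    n = len(samples[0])
    w = [0.0] * n
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            err = y - pred
            for i in range(n):
                if stuck[i]:
                    continue  # stuck-at defect: device cannot be programmed
                # Device-to-device variation: update gain in [0.7, 1.3].
                gain = 1.0 + variation * (2 * random.random() - 1)
                w[i] += lr * err * x[i] * gain
    return w

# Learn logical OR; input 0 is a bias line whose device is stuck at zero.
X = [[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]]
Y = [0, 1, 1, 1]
w = train(X, Y, stuck=[True, False, False])
print([1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0 for x in X])
```

Even with a stuck bias device and noisy updates, the error-driven rule converges here, because the remaining devices absorb the task; this is the kind of resilience the training approaches above are designed to exhibit at scale.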

About Elham Zamanidoost:

Elham Zamanidoost received her Bachelor’s degree in electronics from Shahid Beheshti University, Tehran, Iran, in 2007 and her Master’s degree in computer engineering from UCSB in 2013. She joined Strukov’s lab in 2011 to study the possibilities and challenges of memristive crossbar circuits as neural network pattern classifiers.