Aug 4 (Wed): "High-Performance Training Algorithms and Architectural Optimization of Spiking Neural Networks," Wenrui Zhang, ECE PhD Defense

Location:
Zoom Meeting – Meeting ID: 992 7738 1398 | Passcode: 063572

https://ucsb.zoom.us/j/99277381398?pwd=N3VQS2lvUEMySUlQNVN4aFB3bktlZz09

Abstract:

The spiking neural network (SNN) is an emerging brain-inspired computing paradigm built on a more biologically realistic spiking neuron model. As the third generation of artificial neural networks (ANNs), SNNs have been theoretically shown to possess greater computational power than conventional non-spiking ANNs and are well suited for spatio-temporal information processing and for implementation on ultra-low-power, event-driven neuromorphic hardware. This dissertation aims to usher SNNs into mainstream practice by addressing two key roadblocks: the lack of high-performance training algorithms and the lack of systematic exploration of computationally powerful recurrent SNNs.

First, existing SNN training algorithms suffer from major limitations in learning performance and efficiency. To address these challenges, we proposed a comprehensive set of solutions based on synaptic plasticity (SP) and intrinsic plasticity (IP) that enable energy-efficient, high-performance SNNs. On the SP side, we developed two innovative backpropagation (BP) methods. We proposed the Spike-Train level RSNN Backpropagation (ST-RSBP) algorithm for training deep recurrent SNNs (RSNNs), which addresses the training difficulty introduced by the non-differentiability of the spiking activation function and improves training efficiency by operating at the spike-train level. To enable learning of temporal sequences with precise timing, we proposed a BP method called Temporal Spike Sequence Learning Backpropagation (TSSL-BP), which breaks error backpropagation down into two types of dependencies, inter-neuron and intra-neuron, and precisely captures temporal dependencies with ultra-low latency. On the IP side, we proposed a method called SpiKL-IP, based on a rigorous information-theoretic approach, that maintains homeostasis and shapes the dynamics of neural circuits.
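To make the core training obstacle concrete: the spiking activation is a hard threshold whose derivative is zero almost everywhere, so gradients cannot flow through it directly. The sketch below shows the widely used surrogate-gradient workaround for a single leaky integrate-and-fire (LIF) neuron in PyTorch. It is a minimal, assumption-laden illustration of the problem that methods like ST-RSBP and TSSL-BP address, not the dissertation's actual algorithms; all names, constants, and the rectangular surrogate window are illustrative choices.

```python
# Minimal sketch (PyTorch assumed) of why BP through spiking neurons is hard:
# the hard threshold has zero gradient almost everywhere. A common workaround,
# shown here, substitutes a smooth "surrogate" derivative on the backward pass.
# This is an illustration only, NOT ST-RSBP or TSSL-BP.
import torch

class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, membrane_potential, threshold=1.0):
        ctx.save_for_backward(membrane_potential)
        ctx.threshold = threshold
        # Forward pass: non-differentiable hard threshold (spike = 0 or 1).
        return (membrane_potential >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (u,) = ctx.saved_tensors
        # Backward pass: replace dSpike/du with a smooth pseudo-derivative
        # (a rectangular window around the threshold; an illustrative choice).
        surrogate_grad = (torch.abs(u - ctx.threshold) < 0.5).float()
        return grad_output * surrogate_grad, None

def lif_step(u, x, decay=0.9):
    """One leaky integrate-and-fire (LIF) step: leak, integrate, fire, reset."""
    u = decay * u + x                # leaky integration of input current
    spike = SurrogateSpike.apply(u)  # threshold crossing emits a spike
    u = u * (1.0 - spike)            # hard reset after firing
    return u, spike

# Usage: unroll over T time steps; gradients flow through the surrogate.
T, batch, n = 50, 8, 100
u = torch.zeros(batch, n)
inputs = torch.rand(T, batch, n, requires_grad=True)
spikes = []
for t in range(T):
    u, s = lif_step(u, inputs[t])
    spikes.append(s)
loss = torch.stack(spikes).mean()
loss.backward()                      # succeeds only because of the surrogate
```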

While recurrence is prevalent in the brain, designing practical recurrent spiking neural networks (RSNNs) is challenging due to the intricacy introduced by recurrent connections in both time and space. In current practice, RSNNs are often randomly generated without optimization, which fails to fully exploit their computational potential. We explored and proposed a family of RSNN architectures aimed at building scalable, high-performance large-scale RSNNs. We first demonstrated a new type of RSNN called the Skip-Connected Self-Recurrent SNN (ScSr-SNN), which contains self-recurrent connections in each recurrent layer and skip connections across non-adjacent layers, and which improves performance over existing randomly generated RSNNs. Inspired by the potential of self-recurrent connectivity, we proposed another novel structure called the Laterally-Inhibited Self-Recurrent Unit (LISR), which consists of one excitatory neuron with a self-recurrent connection wired together with an inhibitory neuron through excitatory and inhibitory synapses. SNNs leveraging the LISR as a basic building block significantly improve performance over feedforward SNNs trained by the same BP method at similar computational cost. Finally, we developed a systematic optimization-based neural architecture search framework to synthesize high-performance, globally feedforward and locally recurrent multi-layer RSNNs.
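As a rough structural illustration of the LISR wiring described above, the following sketch pairs each excitatory neuron (with a self-loop) with an inhibitory partner connected through one excitatory and one inhibitory synapse. The simplified LIF dynamics, the per-unit diagonal connectivity, and all weight names are assumptions made for illustration; this is not the dissertation's exact formulation.

```python
# Structural sketch of a LISR-style layer: an excitatory (E) neuron with a
# self-recurrent connection, coupled to an inhibitory (I) partner through an
# excitatory and an inhibitory synapse. Dynamics and connectivity here are
# simplified assumptions for illustration only.
import torch
import torch.nn as nn

class LISRLayer(nn.Module):
    def __init__(self, n_in, n_units, decay=0.9, threshold=1.0):
        super().__init__()
        self.ff = nn.Linear(n_in, n_units)                        # feedforward input
        # Per-unit scalar synapses (diagonal pairing, an assumption):
        self.w_self = nn.Parameter(torch.full((n_units,), 0.5))   # E -> E self-loop
        self.w_e2i = nn.Parameter(torch.full((n_units,), 0.5))    # E -> I (excitatory)
        self.w_i2e = nn.Parameter(torch.full((n_units,), 0.5))    # I -> E (inhibitory)
        self.decay, self.threshold = decay, threshold

    def _fire(self, u):
        # Hard threshold; training would substitute a surrogate gradient here.
        return (u >= self.threshold).float()

    def forward(self, x_seq):                 # x_seq: (T, batch, n_in)
        T, batch, _ = x_seq.shape
        n = self.w_self.numel()
        u_e = torch.zeros(batch, n)           # excitatory membrane potential
        u_i = torch.zeros(batch, n)           # inhibitory membrane potential
        s_e = torch.zeros(batch, n)           # excitatory spikes at t-1
        s_i = torch.zeros(batch, n)           # inhibitory spikes at t-1
        out = []
        for t in range(T):
            # E neuron: input + self-recurrence - lateral inhibition from I.
            u_e = self.decay * u_e + self.ff(x_seq[t]) \
                  + self.w_self * s_e - self.w_i2e * s_i
            # I neuron: driven only by its paired E neuron.
            u_i = self.decay * u_i + self.w_e2i * s_e
            s_e, s_i = self._fire(u_e), self._fire(u_i)
            u_e, u_i = u_e * (1 - s_e), u_i * (1 - s_i)   # reset after firing
            out.append(s_e)
        return torch.stack(out)               # only E spikes leave the unit
```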

The proposed work achieves state-of-the-art performance on a variety of image and speech datasets, such as MNIST, Fashion-MNIST, CIFAR-10, and TI46, as well as on common neuromorphic datasets, including N-MNIST, N-TIDIGITS, and DVS-Gesture.

Bio:

Wenrui Zhang received the B.S. degree in electronic information science and technology from the University of Science and Technology of China, Hefei, China, in 2015. He is currently a Ph.D. candidate in computer engineering at the University of California, Santa Barbara. His research focuses on learning algorithms and architectures for spiking neural networks, brain-inspired computing, and computational brain modeling.

Hosted by: Professor Peng Li

Submitted by: Wenrui Zhang <wenruizhang@ucsb.edu>