University of California, Santa Barbara
Department of Electrical and Computer Engineering


Advanced Signal Processing for Neuroscience

ECE 594C Fall 2010

Instructor: Upamanyu Madhow
madhow@ece.ucsb.edu

Schedule: Mondays and Wednesdays, 6-7:50 pm, Phelps 1437


Course Information

Schedule and Topics

Course Resources

Problem Sets

Projects

Course Information:

Goal:
This is a research-oriented graduate course aimed at bringing ECE and Psychology students together to explore the application of advanced signal processing techniques to neuroimaging.

Text: None. We will use readings from the literature.

Course Mechanics: Students will be asked to form teams (each with at least one Psychology and one ECE student) after the first week of class. All of the work in the course will be done in these teams. Since the goal is to tackle open-ended questions, grading will be on the basis of participation and effort (documented through written reports and oral presentations) rather than results.

Tentative Schedule and list of topics:

Tentative Schedule:


Lecture 1: Introduction (course goals, structure, high-level introduction to EEG and fMRI), signal processing model for EEG
Lectures 2-4: Description of ICA theory and algorithms
Homework: obtain ICA code from the web or write your own, and apply it to artificial data specified by the instructor (see the sketch following this block)
Guest Lecture 5: Handling EEG data, how to get access to data for one or two experiments
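
As a starting point for this homework, here is a minimal sketch of the kind of experiment intended: two artificial non-Gaussian sources are mixed by a random matrix and then unmixed with scikit-learn's FastICA. The source waveforms, mixing matrix, and sample count are my own illustrative choices, not the instructor-specified data.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)

    # Two artificial non-Gaussian sources: a sinusoid and a square wave.
    S = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]

    # Mix them with a random 2x2 matrix (the "unknown" mixing system).
    A = rng.normal(size=(2, 2))
    X = S @ A.T

    # Unmix with FastICA; the recovered sources are determined only up to
    # permutation, sign, and scale.
    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)

    # Sanity check: each estimated source should correlate strongly
    # (in absolute value) with exactly one true source.
    print(np.round(np.abs(np.corrcoef(S.T, S_est.T)[:2, 2:]), 2))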

Assign Project 1 (same for all teams): ICA analysis of an EEG dataset: identifying eye artifacts automatically, identifying important spatial components, and using a coarse head model to localize the regions to which they correspond

Lecture 6: fMRI data acquisition overview
Lectures 7-8: Current approaches for modeling fMRI data
Guest Lecture 9: Guest lecture on handling fMRI data, how to get access to preprocessed (but not spatially averaged) fMRI data
Lecture 10: Basic concepts of detection and estimation (matched filtering, correlation, maximum likelihood), possible new approaches to modeling fMRI data (see the matched-filter sketch following this block)
Lectures 11-12: Student presentations for Project 1
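
As a preview of Lecture 10, here is a minimal sketch of matched filtering / correlation detection on a synthetic voxel time series: a block-design stimulus is convolved with a canonical double-gamma HRF to form the expected response, which is then correlated with a noisy measurement. The TR, block timing, HRF parameters, and noise level are illustrative assumptions, not course specifications.

    import numpy as np
    from scipy.stats import gamma

    TR = 2.0            # repetition time in seconds (illustrative)
    n_scans = 200
    t = np.arange(n_scans) * TR

    # Block design: 20 s of task alternating with 20 s of rest.
    stimulus = ((t % 40) < 20).astype(float)

    # Canonical double-gamma HRF (standard, but assumed, parameter choices).
    h_t = np.arange(0, 32, TR)
    hrf = gamma.pdf(h_t, 6) - gamma.pdf(h_t, 16) / 6.0
    hrf /= hrf.sum()

    # Expected BOLD response: stimulus convolved with the HRF.
    regressor = np.convolve(stimulus, hrf)[:n_scans]

    # Synthetic voxel: a weak response buried in noise.
    rng = np.random.default_rng(1)
    voxel = 0.5 * regressor + rng.normal(scale=1.0, size=n_scans)

    # Matched filter / correlation detector: correlate the measurement with
    # the expected response and form a t-statistic.
    r = np.corrcoef(voxel, regressor)[0, 1]
    t_stat = r * np.sqrt((n_scans - 2) / (1 - r ** 2))
    print(f"correlation = {r:.3f}, t-statistic = {t_stat:.2f}")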

Decide on Project 2: Choose your own topic (oral presentation including numerical results that you generate, with real or artificial data). Possible topics:
--EEG
--fMRI modeling
--Brain-computer interfaces
--Neural network simulation
--Brain-based devices

Lectures 13-14: fMRI acquisition
Lectures 15-16: Compressed sensing
Homework: Write or download code for compressed sensing and apply it to artificial data supplied by the instructor (see the sketch following this block)
Lecture 17: Possible application of compressed sensing to speeding up fMRI acquisition
Lectures 18-20: Student presentations for Project 2
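
For the compressed sensing homework, here is a minimal sketch of recovering a sparse vector from random Gaussian measurements using ISTA (iterative soft thresholding for the l1-regularized least-squares problem). The problem sizes, l1 weight, and iteration count are illustrative choices, and the data are generated in the script rather than supplied by the instructor.

    import numpy as np

    rng = np.random.default_rng(2)
    n, m, k = 256, 80, 8              # signal length, measurements, sparsity

    # Sparse ground-truth signal and random Gaussian sensing matrix.
    x_true = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    x_true[support] = rng.normal(size=k)
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    y = A @ x_true

    # ISTA: a gradient step on 0.5*||y - Ax||^2 followed by soft
    # thresholding, which enforces the l1 (sparsity-promoting) penalty.
    lam = 0.05                         # l1 weight (illustrative)
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(1000):
        x = x + step * (A.T @ (y - A @ x))
        x = np.sign(x) * np.maximum(np.abs(x) - lam * step, 0.0)

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))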

Tentative list of topics:

--Principal Component Analysis (PCA)
--Independent Component Analysis (ICA)
--Application of PCA and ICA to EEG data
--Detection and estimation techniques: maximum likelihood, Bayesian
--Modeling of fMRI signals
--Space-time processing of fMRI data
--Compressed sensing: theory and algorithms
--Application of compressed sensing to speed up fMRI
--(if time permits) Neural network modeling and dynamics

Course Resources:

Independent Component Analysis:

Notes on the Infomax Algorithm

+ Algorithms

Fast ICA Algorithm

The deflation-based fast ICA algorithm was first presented in the paper:
[fast_ica_hyvarinen99] A. Hyvarinen, “Fast and robust fixed-point algorithms for independent component analysis,” IEEE Trans. Neural Networks, vol. 10, no. 3, May 1999.
The symmetric fast ICA algorithm described in this paper is not the one actually implemented in the downloadable code. Both the deflation-based and the symmetric fast ICA algorithms, as currently implemented, are reviewed in the paper:
[fastica_analysis_06] P. Tichavsky, Z. Koldovsky, E. Oja, “Performance analysis of the FastICA algorithm and Cramér-Rao bounds for linear independent component analysis,” IEEE Trans. Signal Processing, vol. 54, no. 4, April 2006.

Fast ICA code is downloadable from http://www.cis.hut.fi/projects/ica/fastica
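
For orientation while reading, the core of the deflation-based algorithm is a one-unit fixed-point update followed by Gram-Schmidt deflation. Below is a minimal numpy sketch with the tanh nonlinearity, assuming the data matrix has already been centered and whitened; it is a reading aid, not a substitute for the downloadable code.

    import numpy as np

    def fastica_deflation(X, n_components, n_iter=200, tol=1e-6, seed=0):
        """Deflation-based fast ICA with the tanh nonlinearity.
        X is assumed centered and whitened, shape (n_channels, n_samples)."""
        rng = np.random.default_rng(seed)
        W = np.zeros((n_components, X.shape[0]))
        for i in range(n_components):
            w = rng.normal(size=X.shape[0])
            w /= np.linalg.norm(w)
            for _ in range(n_iter):
                wx = w @ X
                g, g_prime = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
                # Fixed-point update: w <- E[x g(w'x)] - E[g'(w'x)] w
                w_new = (X * g).mean(axis=1) - g_prime.mean() * w
                # Deflation: project out the directions already found.
                w_new -= W[:i].T @ (W[:i] @ w_new)
                w_new /= np.linalg.norm(w_new)
                converged = abs(abs(w_new @ w) - 1.0) < tol
                w = w_new
                if converged:
                    break
            W[i] = w
        return W @ X   # estimated sources, up to permutation/sign/scale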

Infomax Algorithm

The extended Infomax algorithm, which is what is implemented in the downloadable code, is presented in:
[infomax_sejnowski] T.-W. Lee, M. Girolami, T. J. Sejnowski, “Independent component analysis using an extended Infomax algorithm for mixed sub-Gaussian and super-Gaussian sources,” Neural Computation, vol. 11, no. 2, 1999.

The code is downloadable as part of EEGLab from http://sccn.ucsd.edu/eeglab/

Other algorithms of interest

JADE Algorithm
Cardoso’s cumulant-based JADE algorithm operates on the entire batch of data at once. See this page for a general guide to his work:
http://perso.telecom-paristech.fr/~cardoso/guidesepsou.html
and use this link for MATLAB code for the JADE algorithm for real-valued data (which is what we are interested in for the EEG application):
http://perso.telecom-paristech.fr/~cardoso/Algo/Jade/jadeR.m

Robust ICA Algorithm

Zarzoso and Comon recently proposed a “robust ICA” algorithm; their paper cites prior work indicating that imperfect sphering (due to errors in estimating the covariance matrix from finite data) can degrade the performance of fast ICA. The paper is also interesting because it is recent (2010) and hence offers a current perspective on the field:
[comon_2010] V. Zarzoso, P. Comon, “Robust independent component analysis by iterative maximization of the kurtosis contrast with algebraic optimal step size,” IEEE Trans. Neural Networks, vol. 21, no. 2, February 2010.

The code is downloadable from:
http://www.i3s.unice.fr/~zarzoso/robustica.html

RADICAL Algorithm

This algorithm tries to minimize entropy directly, estimating it using order statistics (a sketch of that estimator follows the reference below). The resource page (including downloadable code) is at:
http://www.cs.umass.edu/~elm/ICA/
The algorithm is described in:
[radical_03] E. G. Learned-Miller, J. W. Fisher, “ICA using spacings estimate of entropy,” J. Machine Learning Research, vol. 4, pp. 1271-1295, 2003.
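
The key ingredient is an m-spacing (Vasicek-style) estimate of differential entropy built from order statistics. The sketch below shows just that estimator; the boundary handling and the choice of m are simplified relative to the paper, and the rotation search that RADICAL performs on top of the estimator is omitted.

    import numpy as np

    def spacing_entropy(x, m=None):
        """m-spacing (Vasicek-style) estimate of differential entropy
        from a one-dimensional sample, using order statistics."""
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        if m is None:
            m = int(np.sqrt(n))        # a common rule of thumb
        lo = np.clip(np.arange(n) - m, 0, n - 1)
        hi = np.clip(np.arange(n) + m, 0, n - 1)
        spacings = x[hi] - x[lo]
        spacings[spacings <= 0] = np.finfo(float).tiny   # guard against ties
        return np.mean(np.log(n / (2.0 * m) * spacings))

    # Sanity check: a standard Gaussian has entropy 0.5*log(2*pi*e) ~ 1.42,
    # and a uniform(0,1) sample has entropy 0.
    rng = np.random.default_rng(3)
    print(spacing_entropy(rng.normal(size=5000)))
    print(spacing_entropy(rng.uniform(size=5000)))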

+ Theory

Delving into ICA theory is not required for the homeworks and projects in this course. However, I have put together a brief guide for those of you who might be interested. There are many other papers that I can also point you to if this list is not enough to satisfy your curiosity.

Perhaps the best starting point for understanding the theory behind ICA is Comon’s 1994 paper:
[comon94] P. Comon, “Independent component analysis: a new concept?,” Signal Processing, pp. 287-314, 1994.

The key ideas behind the fast ICA algorithm are presented in the following two papers:
[fast_ica_hyvarinen99] A. Hyvarinen, “Fast and robust fixed-point algorithms for independent component analysis,” IEEE Trans. Neural Networks, vol. 10, no. 3, May 1999.
[hyvarinen_oja_98] A. Hyvarinen, E. Oja, “Independent component analysis by general nonlinear Hebbian-like learning rules,” Signal Processing, pp. 301-313, 1998.
A more recent performance analysis is provided in:
[fastica_analysis_06] P. Tichavsky, Z. Koldovsky, E. Oja, “Performance analysis of the FastICA algorithm and Cramér-Rao bounds for linear independent component analysis,” IEEE Trans. Signal Processing, vol. 54, no. 4, April 2006.

As we discuss in class, the extended Infomax algorithm uses the concept of the relative gradient, which is described in the paper by Cardoso and Laheld below (for our context, this concept is essentially the same as Amari’s natural gradient, but requires less mathematical sophistication to understand); a small sketch of the resulting update follows the reference:
[cardoso_laheld_96] J.-F. Cardoso, B. H. Laheld, “Equivariant adaptive source separation,” IEEE Trans. Signal Processing, vol. 44, no. 12, December 1996.
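
As a concrete illustration, here is a minimal sketch of one relative-gradient step of the extended Infomax update, W <- W + mu * (I - K*tanh(Y)*Y' - Y*Y') * W, with the per-component sub/super-Gaussian sign switch estimated as in the Lee-Girolami-Sejnowski paper. The function name, step size, and the omission of the practical refinements in the EEGLAB implementation (bias term, block updates, learning-rate annealing) are simplifications of my own.

    import numpy as np

    def extended_infomax_step(W, X, mu=0.001):
        """One relative-gradient step of the extended Infomax update.
        X has shape (n_channels, n_samples); W is a square unmixing matrix."""
        Y = W @ X
        n = X.shape[1]
        # Per-component sign switch: +1 for super-Gaussian, -1 for sub-Gaussian,
        # estimated via sign(E[sech^2(y)]E[y^2] - E[y tanh(y)]).
        k = np.sign(np.mean(1.0 - np.tanh(Y) ** 2, axis=1) * np.mean(Y ** 2, axis=1)
                    - np.mean(Y * np.tanh(Y), axis=1))
        K = np.diag(k)
        I = np.eye(W.shape[0])
        grad = I - K @ (np.tanh(Y) @ Y.T) / n - (Y @ Y.T) / n
        return W + mu * grad @ W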

The following reference says that sphering with an imperfect estimate of the covariance matrix may lead to performance problems in fast ICA:
[fast_ica_finite_sample_07] S. Bermejo, “Finite sample effects of the fast ICA algorithm,” Neurocomputing, pp. 392-399, March 2007.
This observation appears to be one of the motivations behind the robust ICA algorithm proposed by Zarzoso and Comon.

Demonstrations of spurious solutions to ICA with the Infomax or non-cumulant-based fast ICA nonlinearities are provided in several papers:
[ica_spurious_2010] F. Ge, J. Ma, “Spurious solution of the maximum likelihood approach to ICA,” IEEE Signal Processing Letters, vol. 17, no. 7, July 2010. (this implies, for example, that the Infomax algorithm may be vulnerable to spurious minima).
[ica_spurious_vrins05] F. Vrins, M. Verleysen, “Information theoretic versus cumulant-based contrasts for multimodal source separation,” IEEE Signal Processing Letters, vol. 12, no. 3, March 2005. (this implies, for example, that cumulant-based contrast functions such as the fourth moment in fast ICA or JADE may be less vulnerable to local minima than some of the other nonlinearities used in the fast ICA and Infomax algorithms).
A longer follow-up paper is:
[ica_spurious_vrins07] F. Vrins, D.-T. Pham, M. Verleysen, “Mixing and non-mixing local minima of the entropy contrast for blind source separation,” IEEE Trans. Information Theory, vol. 53, no. 3, March 2007.

Problem Sets/Labs:

Problem Set/Lab 1 (due October 20 in class)

Lab 2: fMRI modeling (due November 1)
Addendum to Lab 2 (Hypothesis Testing for fMRI?)
Assigned: November 4
Due: TBD

Lab 3: EEG Modeling (report due November 10)
Reference: [Scherg90] M. Scherg, “Fundamentals of dipole source potential analysis,” Auditory Evoked Magnetic Fields and Electric Potentials (Advances in Audiology, vol. 6), pp. 40-69, 1990.

 



Last Updated: November 5, 2010