Dr. Tsachy Weissman, Professor of Electrical Engineering, Stanford University
The talk will cover both classical and recent results with two intertwined themes. The first is that the characterization of fundamental limits in data compression and communication gives rise to quantities, such as entropy, mutual information, and directed information, that are of relevance to statistical inference at large. The second is that analytic tools and algorithmic know-how from compression can be harnessed to construct and analyze schemes for general tasks such as sequential decision making, non-sequential inference (e.g., denoising), and estimation of the degree to which one phenomenon is relevant to predicting another. A few examples will illustrate how some of these ideas are manifested in inference with real data such as financial time series, genomic sequences, and text.
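As a minimal illustration of the first theme, the quantities mentioned above can be estimated directly from data. The sketch below, which is not from the talk and uses illustrative function names and toy sequences, computes plug-in (empirical) estimates of entropy and mutual information from symbol sequences:

```python
from collections import Counter
from math import log2

def entropy(seq):
    """Plug-in (empirical) entropy of a symbol sequence, in bits."""
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in Counter(seq).values())

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

# Toy example: y deterministically tracks x, so I(X;Y) equals H(X) = 1 bit.
x = "ababababab"
y = "bababababa"
print(mutual_information(x, y))  # → 1.0
```

Such plug-in estimates quantify, in the spirit of the abstract, the degree to which one sequence is relevant to predicting another; directed information refines this by accounting for causality, i.e., the direction of influence over time.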
About Dr. Tsachy Weissman:
Tsachy Weissman graduated summa cum laude with a B.Sc. in electrical engineering from the Technion in 1997 and earned his Ph.D. from the same institution in 2001. He then worked with the information theory group at Hewlett-Packard Laboratories until 2003, when he joined Stanford University, where he is currently Professor of Electrical Engineering and incumbent of the STMicroelectronics chair in the School of Engineering. He has spent leaves at the Technion and at ETH Zurich. Tsachy's research focuses on information theory, compression, communication, statistical signal processing, the interplay between them, and their applications. A Fellow of the IEEE, he is the recipient of several best paper awards and prizes for excellence in research and teaching. He served on the editorial board of the IEEE Transactions on Information Theory from September 2010 to August 2013 and currently serves on the editorial board of Foundations and Trends in Communications and Information Theory. He is Founding Director of the Stanford Compression Forum.
Hosted by: Professor Jerry Gibson