COV883: Special Module in Theoretical Computer Science
Topic: Matrix Concentration Inequalities

II semester: 2018-19

Amitabha Bagchi



Next class: 3PM, Wed 13 Mar 2019. SIT 006.

Overview

Concentration inequalities for sums of random variables (Chernoff bounds and others) have played a key role in the analysis of randomized algorithms. Recent developments in the theory of random matrices have shown that similar concentration results can be proved for the spectral norm of a sum of random matrices. Among many other things, these results have major consequences for applications at the heart of modern data science, including randomized matrix multiplication, subsampling, and sparsification. In this course we will study the inner workings of these inequalities in depth and also discuss some applications.
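To make the phenomenon concrete, here is a minimal numerical sketch (not part of the course materials); the dimension, number of summands, and number of trials are arbitrary choices. It repeatedly samples a sum of independent random symmetric matrices and reports how tightly the spectral norm of the sum clusters around its mean.

    import numpy as np

    rng = np.random.default_rng(0)
    d, n, trials = 20, 200, 500   # matrix dimension, number of summands, repetitions

    norms = []
    for _ in range(trials):
        # Each summand X_k = (G_k + G_k^T)/2 is a random symmetric matrix
        # with independent Gaussian entries of variance 1/n.
        G = rng.normal(scale=1.0 / np.sqrt(n), size=(n, d, d))
        X = (G + np.transpose(G, (0, 2, 1))) / 2
        S = X.sum(axis=0)                   # the random sum S = X_1 + ... + X_n
        norms.append(np.linalg.norm(S, 2))  # spectral norm ||S|| (largest singular value)

    norms = np.array(norms)
    print(f"mean ||S|| = {norms.mean():.3f}, std dev = {norms.std():.3f}")
    # The standard deviation comes out small relative to the mean:
    # the spectral norm of the random sum concentrates.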

Topics

Probability with matrices, matrix moments and cumulants, the matrix Laplace transform method, Lieb's theorem and the subadditivity of the matrix cumulant generating function, proof sketch of Lieb's theorem, matrix Chernoff bounds and applications, the matrix Bernstein inequality and applications.
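As a point of reference for the kind of statement the course builds toward, here is the matrix Bernstein inequality in its Hermitian form, roughly as it appears in [Tropp2015] (exact constants and conditions should be checked against the text):

    Let $X_1, \dots, X_n$ be independent random Hermitian $d \times d$ matrices with
    $\mathbb{E} X_k = 0$ and $\lambda_{\max}(X_k) \le L$ almost surely, and set
    $v = \left\| \sum_{k=1}^{n} \mathbb{E} X_k^2 \right\|$. Then, for all $t \ge 0$,
    \[
      \Pr\!\left[ \lambda_{\max}\!\left( \sum_{k=1}^{n} X_k \right) \ge t \right]
      \;\le\; d \cdot \exp\!\left( \frac{-t^2/2}{v + Lt/3} \right).
    \]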

Background required: Probability theory, linear algebra.

Texts

Note: This class will closely follow the presentation in [Tropp2015]: Joel A. Tropp, An Introduction to Matrix Concentration Inequalities, Foundations and Trends in Machine Learning, vol. 8, no. 1-2, 2015.

Other readings

Lecture record

Assignments

Evaluation

Students will be asked to give 30-minute presentations explaining how the basic results are used in various applications.