Dimension-free tail inequalities for sums of random matrices

Daniel J. Hsu, Sham M. Kakade, Tong Zhang

We derive exponential tail inequalities for sums of random matrices with no dependence on the explicit matrix dimensions. These are similar to the matrix versions of the Chernoff bound and the Bernstein inequality, except that the explicit matrix dimensions are replaced by a trace quantity that can be small even when the dimension is large or infinite. Applications to principal component analysis and approximate matrix multiplication are given to illustrate the utility of the new bounds.
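The "trace quantity" in bounds of this kind is often called the intrinsic dimension (or effective rank) of a positive semidefinite matrix V: the ratio tr(V) / ||V||, where ||V|| is the spectral norm. The sketch below (a minimal illustration, not taken from the paper; the covariance spectrum is an assumed example) shows how this quantity can be far smaller than the ambient dimension when the spectrum decays quickly:

```python
import numpy as np

d = 1000  # ambient dimension
# Example covariance with a fast-decaying spectrum: eigenvalues 1/k^2.
eigvals = 1.0 / np.arange(1, d + 1) ** 2
Sigma = np.diag(eigvals)

# Intrinsic dimension: tr(Sigma) / ||Sigma||_2 (spectral norm).
intrinsic_dim = np.trace(Sigma) / np.linalg.norm(Sigma, 2)

print(f"ambient dimension:   {d}")
print(f"intrinsic dimension: {intrinsic_dim:.3f}")  # close to pi^2/6 ~ 1.645
```

A dimension-free bound whose failure probability scales with this trace quantity rather than with d remains meaningful here even as d grows, which is the regime the paper's applications (e.g. PCA over high- or infinite-dimensional spaces) target.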

