An Introduction to Matrix Concentration Inequalities

@article{Tropp2015AnIT,
  title={An Introduction to Matrix Concentration Inequalities},
  author={Joel A. Tropp},
  journal={ArXiv},
  year={2015},
  volume={abs/1501.01571}
}
  • J. Tropp
  • Published 7 January 2015
  • Mathematics
  • ArXiv
In recent years, random matrices have come to play a major role in computational mathematics, but most of the classical areas of random matrix theory remain the province of experts. Over the last decade, with the advent of matrix concentration inequalities, research has advanced to the point where we can conquer many (formerly) challenging problems with a page or two of arithmetic. The aim of this monograph is to describe the most successful methods from this area along with some interesting… 
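As a concrete illustration of the "page or two of arithmetic" the monograph describes, the sketch below (not from the paper; all parameter names are ours) draws a random sum of bounded self-adjoint matrices and checks its spectral norm against the high-probability threshold implied by the matrix Bernstein inequality, one of the central bounds surveyed in the monograph.

```python
import numpy as np

# Illustrative sketch (not the monograph's code): compare the spectral norm of
# S = sum_k eps_k * A_k, with Rademacher signs eps_k and fixed self-adjoint
# A_k of unit spectral norm, against the matrix Bernstein tail threshold.
rng = np.random.default_rng(0)
d, n, delta = 10, 200, 1e-9   # dimension, number of summands, failure prob.

A = rng.standard_normal((n, d, d))
A = (A + A.transpose(0, 2, 1)) / 2                          # symmetrize each A_k
A /= np.linalg.norm(A, ord=2, axis=(1, 2), keepdims=True)   # enforce ||A_k|| = 1
signs = rng.choice([-1.0, 1.0], size=n)
S = np.einsum('k,kij->ij', signs, A)

# Bernstein parameters: R bounds ||eps_k A_k||; v = ||sum_k E (eps_k A_k)^2||.
R = 1.0
v = np.linalg.norm(np.einsum('kij,kjl->il', A, A), ord=2)

# Smallest t with 2d * exp(-t^2/2 / (v + R*t/3)) <= delta, solved from the
# quadratic t^2 - (2*R*L/3)*t - 2*v*L = 0, where L = log(2d / delta).
L = np.log(2 * d / delta)
threshold = R * L / 3 + np.sqrt((R * L / 3) ** 2 + 2 * v * L)

observed = np.linalg.norm(S, ord=2)
print(observed <= threshold)   # holds except with probability <= delta
```

The observed norm typically sits well below the threshold, which is the point of the matrix Bernstein bound: the deviation scale is governed by the variance parameter `v` and the uniform bound `R`, not by the number of summands alone.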


Universality and sharp matrix concentration inequalities
We show that, under mild assumptions, the spectrum of a sum of independent random matrices is close to that of the Gaussian random matrix whose entries have the same mean and covariance.
Matrix Concentration for Products
This paper develops nonasymptotic growth and concentration bounds for a product of independent random matrices. These results sharpen and generalize recent work of Henriksen-Ward, and they are…
Structured Random Matrices
This chapter reviews a number of recent results, methods, and open problems in this direction, with a particular emphasis on sharp spectral norm inequalities for Gaussian random matrices.
From Poincaré inequalities to nonlinear matrix concentration
This paper deduces exponential matrix concentration from a Poincaré inequality via a short, conceptual argument. Among other examples, this theory applies to matrix-valued functions of a uniformly…
Norms of Randomized Circulant Matrices
We investigate two-sided bounds for operator norms of random matrices with non-homogeneous independent entries. We formulate a lower bound for Rademacher matrices and conjecture that it may be…
Hyperbolic Concentration, Anti-concentration, and Discrepancy
This paper presents the first work that shows either concentration or anti-concentration results for hyperbolic polynomials and implies that a random bi-coloring of any set system with n sets and n elements will have discrepancy O(√(n log n)) with high probability.
On Dimension-free Tail Inequalities for Sums of Random Matrices and Applications
In compressed sensing, the resulting tail inequalities are employed to prove the restricted isometry property when the measurement matrix is a sum of random matrices, without any assumption on the distributions of the matrix entries; a new upper bound on the supremum of stochastic processes is also derived.
Norms of random matrices: local and global problems
...

References

Showing 1–10 of 228 references
Matrix concentration inequalities via the method of exchangeable pairs
This paper derives exponential concentration inequalities and polynomial moment inequalities for the spectral norm of a random matrix. The analysis requires a matrix extension of the scalar…
Second-Order Matrix Concentration Inequalities
The random paving property for uniformly bounded matrices
This note presents a new proof of an important result due to Bourgain and Tzafriri that provides a partial solution to the Kadison-Singer problem. The result shows that every unit-norm matrix whose…
Concentration Inequalities
This text summarizes some of the basic tools used in establishing concentration inequalities, which lie at the heart of the mathematical analysis of various problems in machine learning and have made it possible to derive new efficient algorithms.
Concentration Inequalities - A Nonasymptotic Theory of Independence
Deep connections with isoperimetric problems are revealed, while special attention is paid to applications to the suprema of empirical processes.
Sketching as a Tool for Numerical Linear Algebra
This survey highlights the recent advances in algorithms for numerical linear algebra that have come from the technique of linear sketching, and considers least squares as well as robust regression problems, low-rank approximation, and graph sparsification.
A survey of certain trace inequalities
This paper concerns inequalities like Tr A ≤ Tr B, where A and B are certain Hermitian complex matrices and Tr stands for the trace. In most cases A and B will be exponential or logarithmic expressions…
Sums of random Hermitian matrices and an inequality by Rudelson
We give a new, elementary proof of a key inequality used by Rudelson in the derivation of his well-known bound for random sums of rank-one operators. Our approach is based on Ahlswede and Winter's…
User-Friendly Tail Bounds for Sums of Random Matrices
  • J. Tropp
  • Mathematics
    Found. Comput. Math.
  • 2012
This paper presents new probability inequalities for sums of independent, random, self-adjoint matrices and provides noncommutative generalizations of the classical bounds associated with the names Azuma, Bennett, Bernstein, Chernoff, Hoeffding, and McDiarmid.
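The flagship bound in that paper is the matrix Bernstein inequality; it can be stated as follows (reproduced here for orientation; see the paper for the precise hypotheses):

```latex
% Matrix Bernstein inequality (Tropp, 2012), stated for orientation.
% X_1, ..., X_n are independent, zero-mean, self-adjoint d x d random
% matrices with \lambda_{\max}(X_k) \le R almost surely.
\[
  \sigma^2 \;=\; \Bigl\| \sum_{k=1}^n \mathbb{E}\, X_k^2 \Bigr\|,
  \qquad
  \mathbb{P}\Bigl\{ \lambda_{\max}\Bigl( \sum_{k=1}^n X_k \Bigr) \ge t \Bigr\}
  \;\le\; d \cdot \exp\!\Bigl( \frac{-t^2/2}{\sigma^2 + R t/3} \Bigr).
\]
```

The dimensional factor d in front of the exponential is the price of the noncommutative setting; it is what distinguishes these bounds from their scalar Bernstein ancestor.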
...