User-Friendly Tail Bounds for Sums of Random Matrices

@article{tropp2010userfriendly,
  title={User-Friendly Tail Bounds for Sums of Random Matrices},
  author={Joel A. Tropp},
  journal={Foundations of Computational Mathematics},
  year={2010}
}
  • J. Tropp
  • Published 25 April 2010
  • Mathematics
  • Foundations of Computational Mathematics
This paper presents new probability inequalities for sums of independent, random, self-adjoint matrices. These results place simple and easily verifiable hypotheses on the summands, and they deliver strong conclusions about the large-deviation behavior of the maximum eigenvalue of the sum. Tail bounds for the norm of a sum of random rectangular matrices follow as an immediate corollary. The proof techniques also yield some information about matrix-valued martingales.
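As a reminder of the flavor of these results, the paper's noncommutative Bernstein inequality (stated here from memory, so the exact constants should be checked against the paper) reads: for independent, random, self-adjoint $d \times d$ matrices $X_k$ with $\mathbb{E} X_k = 0$ and $\lambda_{\max}(X_k) \le R$ almost surely,

$$
\mathbb{P}\Bigl\{\lambda_{\max}\Bigl(\sum_k X_k\Bigr) \ge t\Bigr\}
\;\le\; d \cdot \exp\!\left(\frac{-t^2/2}{\sigma^2 + Rt/3}\right),
\qquad
\sigma^2 := \Bigl\|\sum_k \mathbb{E}\,X_k^2\Bigr\|.
$$

The dimensional factor $d$ in front is the price of passing from scalars to matrices; the dimension-free papers listed below work to remove or weaken it.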
User-Friendly Tail Bounds for Matrix Martingales
This report presents probability inequalities for sums of adapted sequences of random, self-adjoint matrices. The results frame simple, easily verifiable hypotheses on the summands, and they yield strong conclusions about the large-deviation behavior of the maximum eigenvalue of the sum.
Convenient Tail Bounds for Sums of Random Tensors
  • S. Chang
  • Mathematics
    Taiwanese Journal of Mathematics
  • 2021
This work presents new probability bounds for sums of random, independent, Hermitian tensors. These probability bounds characterize the large-deviation behavior of the extreme eigenvalue of the sums of random Hermitian tensors.
On Dimension-free Tail Inequalities for Sums of Random Matrices and Applications
In compressed sensing, the resulting tail inequalities are employed to prove the restricted isometry property when the measurement matrix is a sum of random matrices without any assumption on the distributions of the matrix entries, and a new upper bound on the supremum of stochastic processes is derived.
Dimension-free tail inequalities for sums of random matrices
We derive exponential tail inequalities for sums of random matrices with no dependence on the explicit matrix dimensions. These are similar to the matrix versions of the Chernoff bound and Bernstein's inequality.
This bound is optimal in the sense that a matching lower bound holds under mild assumptions, and the constants are sufficiently sharp that we can often capture the precise edge of the spectrum.
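To make the bounds in these entries concrete, here is a quick numerical sanity check, assuming only NumPy. It samples symmetric Gaussian matrices (a matrix Gaussian series over the standard symmetric basis, a special case covered by this family of results) and compares the empirical mean of the largest eigenvalue against the standard bound $\sqrt{2\sigma^2 \log(2d)}$ with $\sigma^2 = \|\mathbb{E} Y^2\|$; the model and constants here are illustrative, not a quote from any one paper.

```python
import numpy as np

# For a d x d symmetric Gaussian matrix Y (off-diagonal variance 1,
# diagonal variance 2), E[Y^2] = (d + 1) * I, so sigma^2 = d + 1.
# Matrix-Gaussian-series bounds give E[lambda_max(Y)] <= sqrt(2 sigma^2 log(2d)).
rng = np.random.default_rng(0)
d, trials = 20, 200

lam_max = []
for _ in range(trials):
    G = rng.standard_normal((d, d))
    Y = (G + G.T) / np.sqrt(2.0)  # symmetrize; off-diagonal entries have variance 1
    lam_max.append(np.linalg.eigvalsh(Y)[-1])  # largest eigenvalue

sigma2 = d + 1                                 # ||E[Y^2]|| for this ensemble
bound = float(np.sqrt(2.0 * sigma2 * np.log(2.0 * d)))
emp = float(np.mean(lam_max))
print(f"empirical E[lambda_max] ~ {emp:.2f}, theoretical bound {bound:.2f}")
```

The empirical average sits comfortably below the bound, and the gap reflects the logarithmic dimensional factor that the dimension-free results above aim to remove.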
Dimension-free Bounds for Sums of Independent Matrices and Simple Tensors via the Variational Principle
This work considers deviation inequalities for sums of independent d-by-d random matrices, as well as rank-one random tensors, and presents bounds that do not depend explicitly on the dimension d, but rather on the effective rank.
Advanced Tools from Probability Theory
This chapter is dedicated to advanced probability results required in the more involved arguments on random measurement matrices, notably precise bounds for Gaussian random matrices and the analysis
Matrix concentration inequalities via the method of exchangeable pairs
This paper derives exponential concentration inequalities and polynomial moment inequalities for the spectral norm of a random matrix. The analysis requires a matrix extension of the scalar method of exchangeable pairs.
Matrix Infinitely Divisible Series: Tail Inequalities and Applications in Optimization
The results encompass the existing work on matrix Gaussian series as a special case and show that the tail inequalities of a matrix i.d. series have applications in several optimization problems including the chance constrained optimization problem and the quadratic optimization problem with orthogonality constraints.
A matrix expander Chernoff bound
A Chernoff-type bound for sums of matrix-valued random variables sampled via a random walk on an expander is proved, confirming a conjecture due to [Wigderson and Xiao 06]. A generic reduction is also provided showing that any concentration inequality for vector-valued martingales implies a concentration inequality for the corresponding expander walk.


Sums of random Hermitian matrices and an inequality by Rudelson
We give a new, elementary proof of a key inequality used by Rudelson in the derivation of his well-known bound for random sums of rank-one operators. Our approach is based on Ahlswede and Winter's technique.
Noncommutative Burkholder/Rosenthal inequalities II: Applications
We show norm estimates for sums of independent random variables in noncommutative Lp-spaces for 1 < p < ∞, following our previous work. These estimates generalize the classical Rosenthal inequalities.
Freedman's inequality is a martingale counterpart to Bernstein's inequality. This result shows that the large-deviation behavior of a martingale is controlled by the predictable quadratic variation of the process.
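For reference, the scalar inequality being described (stated here from memory, so constants should be checked against Freedman's original paper) is: for a martingale $(Y_k)$ with $Y_0 = 0$, difference sequence bounded by $R$, and predictable quadratic variation $W_k$,

$$
\mathbb{P}\bigl\{\exists\, k : Y_k \ge t \ \text{and}\ W_k \le \sigma^2\bigr\}
\;\le\; \exp\!\left(\frac{-t^2/2}{\sigma^2 + Rt/3}\right).
$$

The matrix martingale results cited above replace $Y_k$ by the maximum eigenvalue of a matrix martingale and $W_k$ by the norm of a matrix-valued quadratic variation, at the cost of a dimensional factor.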
A survey of certain trace inequalities
This paper concerns inequalities of the form Tr A ≤ Tr B, where A and B are certain Hermitian complex matrices and Tr denotes the trace. In most cases, A and B are exponential or logarithmic expressions in other matrices.
Positive Definite Matrices
  • R. Bhatia
  • Mathematics
    Princeton University Press
  • 2007
This book represents the first synthesis of the considerable body of new research into positive definite matrices. These matrices play the same role in noncommutative analysis that positive real numbers do in classical analysis.
A Simpler Approach to Matrix Completion
  • B. Recht
  • Computer Science
    J. Mach. Learn. Res.
  • 2011
This paper provides the best bounds to date on the number of randomly sampled entries required to reconstruct an unknown low-rank matrix by minimizing the nuclear norm of the hidden matrix subject to agreement with the provided entries.
Non-asymptotic theory of random matrices: extreme singular values
This survey addresses the non-asymptotic theory of extreme singular values of random matrices with independent entries and focuses on recently developed geometric methods for estimating the hard edge of random matrices (the smallest singular value).
Functions of Matrices: Theory and Computation (Other Titles in Applied Mathematics)
The only book devoted exclusively to matrix functions, this research monograph gives a thorough treatment of the theory of matrix functions and numerical methods for computing them.
The Expected Norm of Random Matrices
  • Y. Seginer
  • Mathematics
    Combinatorics, Probability and Computing
  • 2000
This paper considers the Euclidean operator norm of a random matrix A obtained from a (non-random) matrix by randomizing the signs of its entries; the resulting inequality is the best possible up to a multiplicative constant.