Introduction to the non-asymptotic analysis of random matrices
- R. Vershynin
- Computer Science, Compressed Sensing
- 12 November 2010
This is a tutorial on some basic non-asymptotic methods and concepts in random matrix theory, particularly for the problem of estimating covariance matrices in statistics and for validating probabilistic constructions of measurement matrices in compressed sensing.
- R. Vershynin
- Computer Science
- 27 September 2018
A broad range of illustrations is embedded throughout, including classical and modern results for covariance estimation, clustering, networks, semidefinite programming, coding, dimension reduction, matrix completion, machine learning, compressed sensing, and sparse regression.
A Randomized Kaczmarz Algorithm with Exponential Convergence
A randomized version of the Kaczmarz method for consistent, overdetermined linear systems is proposed, and it is proved to converge at an expected exponential rate; it is the first solver whose rate does not depend on the number of equations in the system.
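The iteration described above is simple to sketch: at each step, sample a row with probability proportional to its squared norm and project the current iterate onto the hyperplane defined by that equation. The following is a minimal illustrative implementation in NumPy (function name and parameters are our own, not from the paper):

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=3000, seed=0):
    """Randomized Kaczmarz for a consistent system Ax = b.

    Rows are sampled with probability proportional to ||a_i||^2, the
    sampling rule for which the expected exponential convergence rate
    is proved."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum("ij,ij->i", A, A)
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        a = A[i]
        # Project the current iterate onto the hyperplane <a, x> = b_i.
        x += (b[i] - a @ x) / row_norms_sq[i] * a
    return x
```

Note that each step touches only a single row of A, which is why the per-iteration cost (and, by the paper's result, the convergence rate) is independent of the number of equations m.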
Uniform Uncertainty Principle and Signal Recovery via Regularized Orthogonal Matching Pursuit
This paper finds a simple regularized version of Orthogonal Matching Pursuit (ROMP) which has advantages of both approaches: the speed and transparency of OMP and the strong uniform guarantees of L1-minimization.
On sparse reconstruction from Fourier and Gaussian measurements
This paper improves upon best‐known guarantees for exact reconstruction of a sparse signal f from a small universal sample of Fourier measurements by showing that there exists a set of frequencies Ω such that one can exactly reconstruct every r‐sparse signal f of length n from its frequencies in Ω, using the convex relaxation.
The Littlewood-Offord problem and invertibility of random matrices
Hanson-Wright inequality and sub-gaussian concentration
In this expository note, we give a modern proof of the Hanson-Wright inequality for quadratic forms in sub-gaussian random variables. We deduce a useful concentration inequality for sub-gaussian random…
Signal Recovery From Incomplete and Inaccurate Measurements Via Regularized Orthogonal Matching Pursuit
The algorithm, Regularized Orthogonal Matching Pursuit (ROMP), seeks to provide the benefits of the two major approaches to sparse recovery, and combines the speed and ease of implementation of the greedy methods with the strong guarantees of the convex programming methods.
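To make the greedy half of this combination concrete, here is a minimal sketch of plain Orthogonal Matching Pursuit, the baseline that ROMP regularizes (this is not the ROMP selection rule itself, which additionally picks a set of coordinates with comparable correlations; the function name and signature are our own):

```python
import numpy as np

def omp(A, y, sparsity):
    """Plain Orthogonal Matching Pursuit for y = Ax with x sparse.

    Greedy step: pick the column most correlated with the residual.
    Projection step: re-solve least squares over the chosen support.
    """
    m, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    for _ in range(sparsity):
        # Column with the largest absolute correlation to the residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit restricted to the current support.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x[support] = coeffs
    return x
```

ROMP's extra regularization step is what upgrades such greedy selection from per-signal success to the uniform guarantees usually associated with L1-minimization.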
Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
It is shown that an s-sparse signal in ℝⁿ can be accurately estimated from m = O(s log(n/s)) single-bit measurements using a simple convex program, and the same convex program works for virtually all generalized linear models, in which the link function may be unknown.
Smallest singular value of a random rectangular matrix
We prove an optimal estimate of the smallest singular value of a random sub‐Gaussian matrix, valid for all dimensions. For an N × n matrix A with independent and identically distributed sub‐Gaussian…