Matrix estimation by Universal Singular Value Thresholding

@article{Chatterjee2015MatrixEB,
  title={Matrix estimation by Universal Singular Value Thresholding},
  author={Sourav Chatterjee},
  journal={Annals of Statistics},
  year={2015},
  volume={43},
  pages={177--214}
}
  • S. Chatterjee
  • Published 6 December 2012
  • Computer Science
  • Annals of Statistics
Consider the problem of estimating the entries of a large matrix, when the observed entries are noisy versions of a small random fraction of the original entries. This problem has received widespread attention in recent times, especially after the pioneering works of Emmanuel Candès and collaborators. This paper introduces a simple estimation procedure, called Universal Singular Value Thresholding (USVT), that works for any matrix that has "a little bit of structure." Surprisingly, this…
Matrix completion by singular value thresholding: sharp bounds
TLDR
The goal of this paper is to provide strong theoretical guarantees, similar to those obtained for nuclear-norm penalization methods and one-step thresholding methods, for an iterative thresholding algorithm which is a modification of the softImpute algorithm.
Matrix completion with data-dependent missingness probabilities
TLDR
Two new estimators, based on singular value thresholding and nuclear norm minimization, are proposed to recover the matrix under the assumption that there is a single number p such that each entry of the matrix is observed independently with probability p and is missing otherwise.
Low-rank matrix completion and denoising under Poisson noise
TLDR
This paper considers the problem of estimating a low-rank matrix from the observation of all or a subset of its entries in the presence of Poisson noise, and identifies several estimators that have an upper error bound that depends on the matrix rank, the fraction of the elements observed and the maximal row and column sums of the true matrix.
Monotone Matrix Estimation via Robust Deconvolution
TLDR
This paper provides a simple, intuitive algorithm for matrix estimation which extends the works by Fan (1991) and Delaigle et al. (2008) and achieves near optimal minimax rate for the matrix estimation as well as robust deconvolution.
Matrix completion from a computational statistics perspective
TLDR
This review examines the success behind low-rank matrix completion, one of the most studied and employed versions of matrix completion, and identifies opportunities to weaken the commonly enforced assumption that entries are missing completely at random.
Adaptive shrinkage of singular values
TLDR
A generalized Stein unbiased risk estimation criterion is proposed that does not require knowledge of the variance of the noise and that is computationally fast and accurately estimates the rank of the signal when it is detectable.
Matrix Completion Under Monotonic Single Index Models
TLDR
This paper proposes a novel matrix completion method that alternates between low-rank matrix estimation and monotonic function estimation to estimate the missing matrix elements and demonstrates the competitiveness of the proposed approach.
Low Permutation-Rank Matrices: Structural Properties and Noisy Completion
TLDR
A richer model based on what is termed the “permutation-rank” of a matrix is proposed, and it is shown how the restrictions due to classical low-rank assumptions can be avoided by using the richer permutation-rank model.
Asymptotic Theory for Estimating the Singular Vectors and Values of a Partially-observed Low Rank Matrix with Noise
TLDR
These estimators combine to form a consistent estimator of the full low rank matrix that is computed with a non-iterative algorithm and achieves the minimax lower bound in Koltchinskii et al. (2011).
Quantized matrix completion for low rank matrices
  • S. Bhaskar
  • Computer Science
    2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2015
TLDR
This paper proposes a globally convergent optimization algorithm exploiting existing work on low-rank matrix factorization, and validates the method on synthetic data, showing improved performance over past methods.

References

SHOWING 1-10 OF 118 REFERENCES
Estimation of high-dimensional low-rank matrices
TLDR
This work investigates penalized least squares estimators with a Schatten-p quasi-norm penalty term and derives bounds for the $k$-th entropy numbers of the quasi-convex Schatten class embeddings $S^M_p \to S^M_2$, $p < 1$, which are of independent interest.
Nuclear norm penalization and optimal rates for noisy low rank matrix completion
TLDR
A new nuclear norm penalized estimator of $A_0$ is proposed, and a general sharp oracle inequality for this estimator is established for arbitrary values of $n, m_1, m_2$ under a condition of isometry in expectation, identifying the best trace regression model approximating the data.
The Power of Convex Relaxation: Near-Optimal Matrix Completion
TLDR
This paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information theoretic limit (up to logarithmic factors).
A Singular Value Thresholding Algorithm for Matrix Completion
TLDR
This paper develops a simple first-order and easy-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank, and develops a framework in which one can understand these algorithms in terms of well-known Lagrange multiplier algorithms.
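The core step of this first-order algorithm is the singular value shrinkage operator, i.e. the proximal operator of the nuclear norm: take an SVD and subtract a constant from every singular value, flooring at zero. A minimal NumPy sketch (the function name `svt_operator` is illustrative, not from the paper):

```python
import numpy as np

def svt_operator(X, tau):
    """Soft-threshold the singular values of X by tau (prox of tau * nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)   # shrink each singular value, floor at 0
    return (U * s_shrunk) @ Vt            # rebuild with the shrunken spectrum
```

Because singular values below `tau` are zeroed outright, the output is low rank whenever the input's spectrum decays, which is what makes the iteration efficient on matrix completion problems.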
Spectral Regularization Algorithms for Learning Large Incomplete Matrices
TLDR
Using the nuclear norm as a regularizer, the algorithm Soft-Impute iteratively replaces the missing elements with those obtained from a soft-thresholded SVD in a sequence of regularized low-rank solutions for large-scale matrix completion problems.
OptShrink: An Algorithm for Improved Low-Rank Signal Matrix Denoising by Optimal, Data-Driven Singular Value Shrinkage
  • R. Nadakuditi
  • Computer Science
    IEEE Transactions on Information Theory
  • 2014
TLDR
This analysis brings into sharp focus the shrinkage-and-thresholding form of the optimal weights, the nonconvex nature of the associated shrinkage function (on the singular values), and explains why matrix regularization via singular value thresholding with convex penalty functions will always be suboptimal.
Estimation of (near) low-rank matrices with noise and high-dimensional scaling
TLDR
Simulations show excellent agreement with the high-dimensional scaling of the error predicted by the theory, and illustrate their consequences for a number of specific learning models, including low-rank multivariate or multi-task regression, system identification in vector autoregressive processes, and recovery of low-rank matrices from random projections.
1-Bit Matrix Completion
TLDR
A theory of matrix completion for the extreme case of noisy 1-bit observations is developed, and it is shown that the maximum likelihood estimate under a suitable constraint returns an accurate estimate of $M$ when $\|M\|_{\infty} \le \alpha$ and $\operatorname{rank}(M) \le r$.
Matrix Completion from Noisy Entries
TLDR
This work studies a low complexity algorithm, introduced in [1], based on a combination of spectral techniques and manifold optimization, that is called here OPTSPACE, and proves performance guarantees that are order-optimal in a number of circumstances.
Matrix Completion With Noise
TLDR
This paper surveys the literature on matrix completion and presents novel results showing that matrix completion is provably accurate even when the few observed entries are corrupted with a small amount of noise, and shows that, in practice, nuclear-norm minimization accurately fills in the many missing entries of large low-rank matrices from just a few noisy samples.