Constructing confidence sets for the matrix completion problem

@inproceedings{Carpentier2017ConstructingCS,
  title={Constructing confidence sets for the matrix completion problem},
  author={Alexandra Carpentier and Olga Klopp and Matthias Löffler},
  year={2017}
}
In the present note we consider the problem of constructing honest and adaptive confidence sets for the matrix completion problem. For the Bernoulli model with known variance of the noise we provide a realizable method for constructing confidence sets that adapt to the unknown rank of the true matrix. 
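For concreteness, here is a minimal simulation sketch of the Bernoulli observation model with known noise variance described in the abstract; the dimensions, rank, observation probability p, and noise level sigma are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch of the Bernoulli sampling model with known noise variance
# (dimensions, rank, p and sigma are illustrative assumptions).
rng = np.random.default_rng(0)
m1, m2, r = 100, 120, 3      # matrix dimensions and true rank
p, sigma = 0.3, 0.5          # observation probability and known noise std

# Low-rank ground truth M = U V^T
U = rng.normal(size=(m1, r))
V = rng.normal(size=(m2, r))
M = U @ V.T

# Each entry is revealed independently with probability p and observed with noise
mask = rng.random((m1, m2)) < p
Y = np.where(mask, M + sigma * rng.normal(size=(m1, m2)), 0.0)
```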
Matrix completion with data-dependent missingness probabilities
Two new estimators, based on singular value thresholding and nuclear norm minimization, are proposed to recover the matrix beyond the standard assumption that there is a single number p such that each entry of the matrix is observed independently with probability p and is missing otherwise.
Inference and uncertainty quantification for noisy matrix completion
A simple procedure is developed to compensate for the bias of the widely used convex and nonconvex estimators, and distributional characterizations are derived for the resulting debiased estimators, enabling optimal construction of confidence intervals/regions for the missing entries and the low-rank factors.
Matrix Completion with Quantified Uncertainty through Low Rank Gaussian Copula
A probabilistic and scalable framework for missing-value imputation with quantified uncertainty that augments a standard probabilistic model, Probabilistic Principal Component Analysis, with marginal transformations for each column, allowing the model to better match the distribution of the data.
Inference for linear forms of eigenvectors under minimal eigenvalue separation: Asymmetry and heteroscedasticity
This work develops algorithms that produce confidence intervals for linear forms of individual eigenvectors, based on eigen-decomposition of the asymmetric data matrix followed by a careful de-biasing scheme, and establishes procedures to construct optimal confidence intervals for the eigenvalues of interest.
Tackling Small Eigen-Gaps: Fine-Grained Eigenvector Estimation and Inference Under Heteroscedastic Noise
Based on eigen-decomposition of the asymmetric data matrix, this paper proposes estimation and uncertainty quantification procedures for an unknown eigenvector, which further allow one to reason about linear functionals of the unknown eigenvector.

References

SHOWING 1-10 OF 18 REFERENCES
Adaptive confidence sets for matrix completion
Bounds for the minimax rates of certain composite hypothesis testing problems arising in low-rank inference are obtained, and it is proved that in the Bernoulli model, honest and adaptive confidence sets exist only when the error variance is known a priori.
Noisy low-rank matrix completion with general sampling distribution
O. Klopp, 2014
It is proved that, up to a logarithmic factor, the new nuclear-norm penalized estimators achieve optimal rates with respect to the estimation error.
The Power of Convex Relaxation: Near-Optimal Matrix Completion
This paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information theoretic limit (up to logarithmic factors).
Matrix completion by singular value thresholding: sharp bounds
The goal of this paper is to provide strong theoretical guarantees, similar to those obtained for nuclear-norm penalization methods and one-step thresholding methods, for an iterative thresholding algorithm that is a modification of the softImpute algorithm.
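As a rough illustration of the kind of iterative singular-value soft-thresholding scheme referred to here, the following is a hedged sketch in the spirit of softImpute, not the paper's exact algorithm; the regularization level and iteration count are illustrative assumptions.

```python
import numpy as np

def soft_impute(Y, mask, lam, n_iter=100):
    """Iterative singular-value soft-thresholding in the spirit of softImpute.

    Y    : observed matrix (values at unobserved positions are ignored)
    mask : boolean array, True where the entry is observed
    lam  : soft-thresholding level (illustrative, problem dependent)
    """
    X = np.zeros_like(Y, dtype=float)
    for _ in range(n_iter):
        # Fill unobserved entries with the current estimate, keep observed data
        Z = np.where(mask, Y, X)
        # Soft-threshold the singular values of the completed matrix
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s = np.maximum(s - lam, 0.0)
        X = (U * s) @ Vt
    return X
```

A common heuristic is to take lam on the order of sigma * sqrt(p * max(m1, m2)) (up to logarithmic factors) and then tune it on held-out observed entries.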
Matrix estimation by Universal Singular Value Thresholding
This paper introduces a simple estimation procedure, called Universal Singular Value Thresholding (USVT), that works for any matrix that has "a little bit of structure" and achieves the minimax error rate up to a constant factor.
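A hedged sketch of a USVT-style estimator as it is commonly described: zero-fill the missing entries, hard-threshold the singular values at a universal level, rescale by the estimated observation probability, and clip back to the known entry range. The threshold constant, the slack eta, and the bound b are assumptions here; consult the paper for the precise recipe.

```python
import numpy as np

def usvt(Y, mask, b=1.0, eta=0.01):
    """Universal Singular Value Thresholding (USVT)-style sketch.

    Y    : data matrix, entries assumed bounded by b in absolute value
    mask : boolean array, True where the entry is observed
    """
    m1, m2 = Y.shape
    p_hat = mask.mean()                       # estimated observation probability
    Z = np.where(mask, Y, 0.0) / b            # zero-fill and rescale to [-1, 1]
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    thr = (2.0 + eta) * np.sqrt(max(m1, m2) * p_hat)
    keep = s >= thr                           # keep only large singular values
    W = (U[:, keep] * s[keep]) @ Vt[keep, :] / p_hat
    return b * np.clip(W, -1.0, 1.0)          # clip and undo the rescaling
```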
Tight oracle bounds for low-rank matrix recovery from a minimal number of random measurements
It is shown that properly constrained nuclear-norm minimization stably recovers a low-rank matrix from a constant number of noisy measurements per degree of freedom; this seems to be the first result of this nature.
Nuclear norm penalization and optimal rates for noisy low rank matrix completion
A new nuclear-norm penalized estimator of $A_0$ is proposed, and a general sharp oracle inequality for this estimator is established for arbitrary values of $n, m_1, m_2$ under the condition of isometry in expectation, in order to find the best trace regression model approximating the data.
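In the trace-regression notation used here, the estimator in question takes the familiar nuclear-norm penalized least-squares form; this is a sketch of its general shape only, and the exact weighting and the choice of $\lambda$ are as specified in the paper:

$$\hat{A} \in \operatorname*{arg\,min}_{A \in \mathbb{R}^{m_1 \times m_2}} \Big\{ \frac{1}{n} \sum_{i=1}^{n} \big( Y_i - \langle X_i, A \rangle \big)^2 + \lambda \, \|A\|_{*} \Big\},$$

where $(X_i, Y_i)_{i=1}^{n}$ are the trace-regression observations and $\lambda > 0$ is a regularization parameter.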
A new look at independence
The concentration of measure phenomenon in product spaces is a far-reaching abstract generalization of the classical exponential inequalities for sums of independent random variables. We attempt to …
Stability of matrix factorization for collaborative filtering
The stability, vis-à-vis adversarial noise, of matrix factorization algorithms for matrix completion is studied, and the prediction error of individual users is analyzed based on subspace stability, addressing collaborative filtering under manipulator attacks.
Matrix completion via max-norm constrained optimization
It is shown that the max-norm constrained method is minimax rate-optimal and yields a unified and robust approximate recovery guarantee with respect to the sampling distributions.
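One common way to handle a max-norm constraint in practice is through a factorized surrogate with bounded row norms, since bounding the squared row norms of both factors by R bounds the max norm of their product by R. The following is a heuristic sketch under assumed notation, not the convex program analyzed in the paper; k, R, lr, and n_iter are illustrative choices.

```python
import numpy as np

def maxnorm_factorized(Y, mask, k=10, R=5.0, lr=1e-3, n_iter=500, seed=0):
    """Heuristic factorized surrogate for max-norm constrained completion.

    Writes X = U @ V.T and keeps every row of U and V at squared norm <= R,
    which bounds the max norm of X by R.
    """
    rng = np.random.default_rng(seed)
    m1, m2 = Y.shape
    U = 0.1 * rng.normal(size=(m1, k))
    V = 0.1 * rng.normal(size=(m2, k))
    for _ in range(n_iter):
        resid = np.where(mask, U @ V.T - Y, 0.0)   # residual on observed entries
        gU, gV = resid @ V, resid.T @ U            # gradients of 0.5 * ||resid||_F^2
        U -= lr * gU
        V -= lr * gV
        # Project each row back onto the ball of squared radius R
        for W in (U, V):
            norms = np.linalg.norm(W, axis=1, keepdims=True)
            np.divide(W, np.maximum(norms / np.sqrt(R), 1.0), out=W)
    return U @ V.T
```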
...