Corpus ID: 225062535

On the robustness of noise-blind low-rank recovery from rank-one measurements

@article{Krahmer2020OnTR,
  title={On the robustness of noise-blind low-rank recovery from rank-one measurements},
  author={Felix Krahmer and Christian K{\"u}mmerle and Oleh Melnyk},
  journal={arXiv preprint arXiv:2010.12402},
  year={2020}
}
We prove new results about the robustness of well-known convex noise-blind optimization formulations for the reconstruction of low-rank matrices from underdetermined linear measurements. Our results are applicable for symmetric rank-one measurements as used in a formulation of the phase retrieval problem. We obtain these results by establishing that with high probability rank-one measurement operators defined by i.i.d. Gaussian vectors exhibit the so-called Schatten-1 quotient property, which… 
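The measurement model in the abstract is easy to reproduce numerically. The following is a minimal sketch (the dimensions and variable names are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 6, 12  # ambient dimension and number of measurements; m << d**2, so the system is underdetermined

# i.i.d. Gaussian vectors a_1, ..., a_m defining the measurement operator
a = rng.standard_normal((m, d))

def A(X):
    """Symmetric rank-one measurement operator: A(X)_i = a_i^T X a_i = <a_i a_i^T, X>."""
    return np.einsum('id,de,ie->i', a, X, a)

# a symmetric rank-one ground truth, as in the phase retrieval formulation: X0 = x x^T
x = rng.standard_normal(d)
X0 = np.outer(x, x)
y = A(X0)  # noiseless measurements; these equal the phaseless observations (a_i^T x)^2
```

Since A(X0)_i = (a_i^T x)^2, recovering X0 from y is the lifted formulation of phase retrieval; the paper's results concern the robustness of convex recovery from such measurements without knowledge of the noise level.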


Citations

On the robustness of minimum-norm interpolators
A quantitative bound for the prediction error is given, relating it to the Rademacher complexity of the covariates, the norm of the minimum norm interpolator of the errors and the shape of the subdifferential around the true parameter.
On the robustness of minimum norm interpolators and regularized empirical risk minimizers
A quantitative bound for the prediction error is given, relating it to the Rademacher complexity of the covariates, the norm of the minimum norm interpolator of the errors and the size of the subdifferential around the true parameter.
AdaBoost and robust one-bit compressed sensing
The results provide an explanation why interpolating adversarial noise can be harmless for classification problems and improve convergence rates when the features satisfy a small deviation lower bound.

References

Showing 1–10 of 67 references
Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
It is shown that properly constrained nuclear-norm minimization stably recovers a low-rank matrix from a constant number of noisy measurements per degree of freedom; this seems to be the first result of this nature.
Stable low-rank matrix recovery via null space properties
It is shown that nuclear norm minimization uniformly and stably reconstructs Hermitian rank-$r$ matrices with high probability and discusses applications in quantum physics and the phase retrieval problem.
Robust Instance-Optimal Recovery of Sparse Signals at Unknown Noise Levels
This work considers the problem of sparse signal recovery from noisy measurements, gives a recovery guarantee once the tuning parameter is above a threshold, analyzes the effect of a badly chosen (mistuned) tuning parameter on a theoretical level, and proves the optimality of the recovery guarantee.
On the Convex Geometry of Blind Deconvolution and Matrix Completion
This paper takes a novel, more geometric viewpoint to analyze both the matrix completion and the blind deconvolution scenario and finds that for both applications the dimension factors in the noise bounds are not an artifact of the proof, but that the problems are intrinsically badly conditioned.
Low-rank Matrix Recovery via Iteratively Reweighted Least Squares Minimization
An efficient implementation of an iteratively reweighted least squares algorithm for recovering a matrix from a small number of linear measurements is presented, designed to simultaneously promote both a minimal nuclear norm and an approximately low-rank solution.
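The iteratively reweighted least squares idea can be sketched as follows (a simplified illustration, not the authors' implementation; the dimensions, the smoothing schedule, and the generic Gaussian measurement map are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, m = 8, 1, 40                      # matrix size, target rank, number of measurements
X0 = np.outer(rng.standard_normal(d), rng.standard_normal(d))  # rank-one ground truth
M = rng.standard_normal((m, d * d))     # generic Gaussian measurement map on vec(X)
y = M @ X0.ravel()

# initialize with the minimum-Frobenius-norm solution of M vec(X) = y
X = np.linalg.lstsq(M, y, rcond=None)[0].reshape(d, d)
eps = np.linalg.norm(X, 2)

for _ in range(200):
    # reweighting: W = (X X^T + eps^2 I)^(-1/2) smooths the nuclear-norm objective
    U, s, _ = np.linalg.svd(X)
    eps = max(min(eps, s[r]), 1e-6)     # shrink the smoothing with the (r+1)-st singular value
    Winv = U @ np.diag(np.sqrt(s**2 + eps**2)) @ U.T   # W^{-1}, formed directly from the SVD
    # weighted least squares: minimize tr(X^T W X) subject to M vec(X) = y;
    # with row-major vec, the inverse Hessian of the quadratic is kron(W^{-1}, I)
    Hinv = np.kron(Winv, np.eye(d))
    lam = np.linalg.solve(M @ Hinv @ M.T, y)
    X = (Hinv @ M.T @ lam).reshape(d, d)

rel_err = np.linalg.norm(X - X0) / np.linalg.norm(X0)
```

Each iteration solves an equality-constrained weighted least squares problem in closed form; driving eps to zero along with the (r+1)-st singular value makes the quadratic surrogate approach the nuclear norm, which is what simultaneously promotes a small nuclear norm and an approximately low-rank iterate.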
Robustness to Unknown Error in Sparse Regularization
Stability and robustness estimates are developed that prove the robustness of quadratically constrained basis pursuit under unknown error in the cases of random Gaussian matrices and of random matrices with heavy-tailed rows, such as random sampling matrices from bounded orthonormal systems.
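Equality-constrained basis pursuit, the noise-blind counterpart of the quadratically constrained program studied here, can be posed as a linear program. A minimal sketch, assuming a Gaussian sensing matrix and using SciPy's `linprog` in place of a dedicated solver:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
m, d, s = 30, 60, 4                      # measurements, dimension, sparsity
A = rng.standard_normal((m, d)) / np.sqrt(m)
x0 = np.zeros(d)
support = rng.choice(d, size=s, replace=False)
x0[support] = rng.standard_normal(s)     # sparse ground truth
b = A @ x0

# basis pursuit  min ||x||_1  s.t.  A x = b,  as an LP via the split x = u - v, u, v >= 0
c = np.ones(2 * d)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
x_rec = res.x[:d] - res.x[d:]
```

With m = 30 Gaussian measurements of a 4-sparse vector in dimension 60, the linear program recovers x0 exactly; mistuning questions only arise once the equality constraint is replaced by a noise-level-dependent inequality.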
Universal low-rank matrix recovery from Pauli measurements
It is shown that almost all sets of O(rd log^6 d) Pauli measurements satisfy the rank-r restricted isometry property (RIP), which implies that M can be recovered from a fixed ("universal") set of Pauli measurements, using nuclear-norm minimization (e.g., the matrix Lasso), with nearly optimal bounds on the error.
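The Pauli measurement model is easy to reproduce numerically. The following sketch (two qubits, so d = 4; an assumed toy example, not from the paper) checks the orthogonality underlying the RIP analysis and forms the expectation values tr(P_i ρ) of a rank-one state:

```python
import numpy as np
from itertools import product

# single-qubit Pauli matrices
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

n, d = 2, 4  # two qubits, d = 2**n
paulis = []
for combo in product([I2, sx, sy, sz], repeat=n):
    P = combo[0]
    for Q in combo[1:]:
        P = np.kron(P, Q)
    paulis.append(P)  # d**2 = 16 tensor-product Pauli observables

# orthogonality: tr(P_i P_j) = d if i == j, else 0
G = np.array([[np.trace(Pi @ Pj) for Pj in paulis] for Pi in paulis]).real

# rank-one density matrix rho = psi psi^† and its Pauli expectation values
rng = np.random.default_rng(0)
psi = rng.standard_normal(d) + 1j * rng.standard_normal(d)
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())
y = np.array([np.trace(P @ rho).real for P in paulis])  # real, since P and rho are Hermitian

# with all d**2 coefficients, rho is re-synthesized exactly from the orthogonal basis
rho_rec = sum(yi * P for yi, P in zip(y, paulis)) / d
```

The compressed-sensing result is that a random subset of only O(rd log^6 d) of these d² expectation values already determines a rank-r state via nuclear-norm minimization, rather than requiring the full basis as in the exact re-synthesis above.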
Blind Deconvolution Using Convex Programming
It is proved that, for "generic" signals, the program can deconvolve w and x exactly when the maximum of N and K is almost on the order of L, and it is shown that if x is drawn from a random subspace of dimension N, and w is a vector in a subspace of dimension K whose basis vectors are spread out in the frequency domain, then nuclear norm minimization recovers wx* without error.
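The lifting behind this formulation can be verified numerically: each Fourier coefficient of the circular convolution w ⊛ x is a rank-one linear measurement of the outer product w x^T. A small sketch, assuming real signals and no subspace structure:

```python
import numpy as np

rng = np.random.default_rng(0)
L = 16
w = rng.standard_normal(L)
x = rng.standard_normal(L)

# circular convolution y = w ⊛ x, computed via the FFT
y = np.fft.ifft(np.fft.fft(w) * np.fft.fft(x)).real

# lifting: hat(y)_l = (f_l^T w)(f_l^T x) = f_l^T (w x^T) f_l, a rank-one measurement
F = np.fft.fft(np.eye(L))          # DFT matrix; row l is f_l
X = np.outer(w, x)                 # the lifted rank-one matrix
meas = np.array([F[l] @ X @ F[l] for l in range(L)])
```

This identity is the lifting step: recovering w and x from y becomes recovery of the rank-one matrix w x^T from linear measurements, and the convex program then relaxes the rank constraint to nuclear-norm minimization.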
Quantum tomography via compressed sensing: error bounds, sample complexity and efficient estimators
A new theoretical analysis of compressed tomography is presented, based on the restricted isometry property for low-rank matrices, and it is shown that unknown low-rank states can be reconstructed from an incomplete set of measurements, using techniques from compressed sensing and matrix completion.