MMSE of probabilistic low-rank matrix estimation: Universality with respect to the output channel

@article{Lesieur2015MMSEOP,
  title={MMSE of probabilistic low-rank matrix estimation: Universality with respect to the output channel},
  author={Thibault Lesieur and Florent Krzakala and Lenka Zdeborov{\'a}},
  journal={2015 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton)},
  year={2015},
  pages={680--687}
}
This paper considers probabilistic estimation of a low-rank matrix from non-linear element-wise measurements of its elements. We derive the corresponding approximate message passing (AMP) algorithm and its state evolution. Relying on non-rigorous but standard assumptions motivated by statistical physics, we characterize the minimum mean squared error (MMSE) achievable information theoretically and with the AMP algorithm. Unlike in related problems of linear estimation, in the present setting… 
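The AMP-plus-state-evolution recipe described in the abstract can be illustrated on the simplest instance of the problem. Below is a minimal sketch for the rank-one spiked Wigner model with a Rademacher (±1) prior and a Gaussian output channel — illustrative assumptions for this sketch only, not the paper's general setting, which covers arbitrary element-wise output channels; the variable names (`f`, `amp_overlap`, etc.) are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, T = 2000, 1.5, 30

# Ground truth: Rademacher (+-1) signal (an illustrative choice of prior).
x_star = rng.choice([-1.0, 1.0], size=n)

# Spiked Wigner observation: Y = (lam/n) x x^T + W/sqrt(n), W symmetric Gaussian.
G = rng.normal(size=(n, n))
W = (G + G.T) / np.sqrt(2)
Y = (lam / n) * np.outer(x_star, x_star) + W / np.sqrt(n)

def f(v):
    # Bayes posterior-mean denoiser for a +-1 signal under effective Gaussian noise.
    return np.tanh(lam * v)

def fprime(v):
    return lam * (1.0 - np.tanh(lam * v) ** 2)

# AMP iteration with the Onsager correction term.
x_prev = np.zeros(n)
x = rng.normal(scale=0.1, size=n)  # small random initialization
for _ in range(T):
    b = fprime(x).mean()           # Onsager coefficient
    x_new = Y @ f(x) - b * f(x_prev)
    x_prev, x = x, x_new

amp_overlap = abs(f(x) @ x_star) / n

# State evolution for this model: m_{t+1} = E[tanh(lam^2 m_t + lam sqrt(m_t) Z)],
# evaluated here by Monte Carlo; its fixed point predicts the asymptotic overlap.
Z = rng.normal(size=200_000)
m = 0.01
for _ in range(200):
    m = np.tanh(lam**2 * m + lam * np.sqrt(m) * Z).mean()

print(f"AMP overlap {amp_overlap:.3f} vs SE prediction {m:.3f}")
```

For `lam` above the spectral threshold (here 1.5 > 1), the empirical overlap reached by AMP should match the state-evolution fixed point up to finite-size fluctuations, which is the kind of agreement the paper's analysis characterizes for general output channels.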


Mismatched Estimation of rank-one symmetric matrices under Gaussian noise
TLDR
The full exact analytic expression of the asymptotic mean squared error (MSE) is derived in the large system size limit for the particular case of Gaussian priors and additive noise.
Mutual information in rank-one matrix estimation
TLDR
Using an interpolation method proposed by Guerra and later refined by Korada and Macris, it is proved that the Bethe mutual information always yields an upper bound on the exact mutual information in the case of rank-one symmetric matrix estimation.
Fundamental limits of symmetric low-rank matrix estimation
TLDR
This paper considers the high-dimensional inference problem where the signal is a low-rank symmetric matrix corrupted by additive Gaussian noise, and computes the limit, in the large-dimension setting, of the mutual information between the signal and the observations, while the rank of the signal remains constant.
Rank-one matrix estimation: analysis of algorithmic and information theoretic limits by the spatial coupling method
TLDR
The spatial coupling methodology developed in the framework of error-correcting codes is used to rigorously derive the mutual information for the symmetric rank-one case, and it is shown that the computational gap vanishes for the proposed spatially coupled model, a promising feature with many possible applications.
Fundamental limits of low-rank matrix estimation: the non-symmetric case
TLDR
This work considers the high-dimensional inference problem where the signal is a low-rank matrix corrupted by additive Gaussian noise, and computes the limit, in the large-dimension setting, of the mutual information between the signal and the observations, as well as the matrix minimum mean square error.
Mutual information for symmetric rank-one matrix estimation: A proof of the replica formula
TLDR
It is shown how to rigorously prove the conjectured formula for the symmetric rank-one case, which makes it possible to express the minimal mean square error and to characterize the detectability phase transitions in a large set of estimation problems ranging from community detection to sparse PCA.
Phase transitions in spiked matrix estimation: information-theoretic analysis
TLDR
The minimal mean squared error is computed for the estimation of the low-rank signal and it is compared to the performance of spectral estimators and message passing algorithms.
Phase Transitions and Sample Complexity in Bayes-Optimal Matrix Factorization
TLDR
This work computes the minimal mean squared error achievable, in principle, in any computational time, as well as the error achieved by an efficient approximate message passing algorithm, based on the asymptotic state-evolution analysis of the algorithm.
Constrained Low-rank Matrix Estimation: Phase Transitions, Approximate Message Passing and Applications
TLDR
This work unifies the derivation of the TAP equations for models as different as the Sherrington-Kirkpatrick model, the restricted Boltzmann machine, the Hopfield model, and vector (XY, Heisenberg, and other) spin glasses.
Information-Theoretic Bounds and Phase Transitions in Clustering, Sparse PCA, and Submatrix Localization
TLDR
The upper bounds show that for each of these problems there is a significant regime where reliable detection is information-theoretically possible but where known algorithms such as PCA fail completely, since the spectrum of the observed matrix is uninformative.

References

SHOWING 1-10 OF 32 REFERENCES
Iterative estimation of constrained rank-one matrices in noise
  • S. Rangan, A. Fletcher
  • Computer Science
    2012 IEEE International Symposium on Information Theory Proceedings
  • 2012
TLDR
This work considers the problem of estimating a rank-one matrix in Gaussian noise under a probabilistic model for the left and right factors of the matrix and proposes a simple iterative procedure that reduces the problem to a sequence of scalar estimation computations.
Phase transitions in sparse PCA
TLDR
It is shown that both for low density and for large rank the problem undergoes a series of phase transitions, suggesting the existence of a region of parameters where estimation is information-theoretically possible but where AMP (and presumably every other polynomial-time algorithm) fails.
Generalized approximate message passing for estimation with random linear mixing
  • S. Rangan
  • Computer Science
    2011 IEEE International Symposium on Information Theory Proceedings
  • 2011
TLDR
G-AMP incorporates general measurement channels, and it is shown that the asymptotic behavior of the G-AMP algorithm for large i.i.d. Gaussian transform matrices is described by a simple set of state evolution (SE) equations, similar to the AWGN output channel case.
Phase Transitions and Sample Complexity in Bayes-Optimal Matrix Factorization
TLDR
This work computes the minimal mean squared error achievable, in principle, in any computational time, as well as the error achieved by an efficient approximate message passing algorithm, based on the asymptotic state-evolution analysis of the algorithm.
Information-theoretically optimal sparse PCA
TLDR
This work analyzes an Approximate Message Passing algorithm to estimate the underlying signal and shows, in the high-dimensional limit, that the AMP estimates are information-theoretically optimal, effectively providing a single-letter characterization of the sparse PCA problem.
Adaptive damping and mean removal for the generalized approximate message passing algorithm
TLDR
Adaptive-damping and mean-removal strategies are proposed that aim to prevent divergence, and numerical results demonstrate significantly enhanced robustness to non-zero-mean, rank-deficient, column-correlated, and ill-conditioned A.
Low-rank matrix reconstruction and clustering via approximate message passing
TLDR
This work proposes an efficient approximate message passing algorithm, derived from the belief propagation algorithm, to perform Bayesian inference for matrix reconstruction, and successfully applies it to a clustering problem by reformulating the latter as a low-rank matrix reconstruction problem with an additional structural property.
The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
TLDR
This paper proves that state evolution indeed holds asymptotically in the large system limit for sensing matrices with independent and identically distributed Gaussian entries, providing a rigorous foundation for the heuristic.
Computational Barriers in Minimax Submatrix Detection
TLDR
The minimax detection of a small submatrix of elevated mean in a large matrix contaminated by additive Gaussian noise is studied and it is shown that the hardness of attaining the minimax estimation rate can crucially depend on the loss function.
Message-passing algorithms for compressed sensing
TLDR
A simple, costless modification to iterative thresholding is introduced, making the sparsity-undersampling tradeoff of the new algorithms equivalent to that of the corresponding convex optimization procedures; the algorithms are inspired by belief propagation in graphical models.