Corpus ID: 221090743

Optimal Combination of Linear and Spectral Estimators for Generalized Linear Models

@article{Mondelli2020OptimalCO,
  title={Optimal Combination of Linear and Spectral Estimators for Generalized Linear Models},
  author={Marco Mondelli and Christos Thrampoulidis and Ramji Venkataramanan},
  journal={arXiv preprint arXiv:2008.03326},
  year={2020}
}
We study the problem of recovering an unknown signal $\boldsymbol x$ given measurements obtained from a generalized linear model with a Gaussian sensing matrix. Two popular solutions are based on a linear estimator $\hat{\boldsymbol x}^{\rm L}$ and a spectral estimator $\hat{\boldsymbol x}^{\rm s}$. The former is a data-dependent linear combination of the columns of the measurement matrix, and its analysis is quite simple. The latter is the principal eigenvector of a data-dependent matrix, and…

Citations

Approximate Message Passing with Spectral Initialization for Generalized Linear Models
Proposes a two-phase artificial AMP algorithm that first produces the spectral estimator and then closely approximates the iterates of the true AMP, yielding a rigorous characterization of the performance of AMP with spectral initialization in the high-dimensional limit.

Generalization Guarantees for Neural Architecture Search with Train-Validation Split
Reveals that the upper-level problem helps select the most generalizable model and prevents overfitting with a near-minimal validation sample size, and establishes generalization bounds for continuous search spaces, which are highly relevant to popular differentiable search schemes.

PCA Initialization for Approximate Message Passing in Rotationally Invariant Models
Proposes a two-phase artificial AMP that first approximates the PCA estimator and then mimics the true AMP, showing excellent agreement between AMP results and theoretical predictions and suggesting an interesting open direction: achieving Bayes-optimal performance.

Stochasticity helps to navigate rough landscapes: comparing gradient-descent-based algorithms in the phase retrieval problem
Applies dynamical mean-field theory from statistical physics to analytically characterize the full trajectories of gradient-based algorithms in their continuous-time limit, with a warm start and for large system sizes, unveiling several intriguing properties of the landscape and the algorithms.

Construction of optimal spectral methods in phase retrieval
Combines the linearization of message-passing algorithms with an analysis of the Bethe Hessian, a classical tool of statistical physics, to derive, in an automated manner, optimal spectral methods for arbitrary channel noise and right-unitarily invariant matrix $\mathbf{\Phi}$.

References

Showing 1–10 of 66 references
Analysis of Spectral Methods for Phase Retrieval With Random Orthogonal Matrices
Aims to obtain the same level of understanding for isotropically random column-orthogonal matrices, which are substantially better models for practical phase retrieval systems.
Recovering Structured Data From Superimposed Non-Linear Measurements
Considers two algorithmic approaches as special instances of a more abstract framework that includes sub-Gaussian measurement designs as well as general (convex) structural constraints; the results are of independent interest for various recovery and learning tasks, as they apply to arbitrary non-linear observation models.
A modern maximum-likelihood theory for high-dimensional logistic regression
  • P. Sur, E. Candès
  • Proceedings of the National Academy of Sciences, 2019
Proves that the maximum-likelihood estimate (MLE) is biased, that the variability of the MLE is far greater than classically estimated, and that the likelihood-ratio test (LRT) is not distributed as a $\chi^2$.
Lifting high-dimensional non-linear models with Gaussian regressors
Proposes and analyzes an alternative convex recovery method that treats non-linear link functions as if they were linear in a lifted, higher-dimensional space; the error analysis captures the effect of both the nonlinearity and the problem's geometry in a few simple summary parameters.
Optimal Spectral Initialization for Signal Recovery With Applications to Phase Retrieval
Leverages recent results that exactly characterize the performance of the spectral method in the high-dimensional limit to map the task of optimal design to a constrained optimization problem in a weighted $L^2$ function space.
Optimal errors and phase transitions in high-dimensional generalized linear models
Rigorously establishes the intrinsic information-theoretic limitations of inference and learning for a class of randomly generated instances of generalized linear models, thus settling several decades-old conjectures.
Finite Sample Analysis of Approximate Message Passing Algorithms
Establishes a concentration inequality for AMP with finite-dimensional Gaussian matrices having independent and identically distributed entries, showing that the probability of deviation from the state evolution prediction falls off exponentially; this provides theoretical support for empirical findings that AMP performance agrees excellently with state evolution predictions already at moderately large dimensions.
Fundamental Limits of Weak Recovery with Applications to Phase Retrieval
Proves that, in the high-dimensional limit, a sharp phase transition takes place, and locates the threshold of the proposed spectral method in the regime of vanishingly small noise.
Generalized linear models
  • Routledge, 2018
High-Dimensional Probability: An Introduction with Applications in Data Science
Embeds a broad range of illustrations throughout, including classical and modern results for covariance estimation, clustering, networks, semidefinite programming, coding, dimension reduction, matrix completion, machine learning, compressed sensing, and sparse regression.