Corpus ID: 224803172

A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval

@article{Wu2020ACM,
  title={A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval},
  author={Fan Wu and Patrick Rebeschini},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.10168}
}
We analyze continuous-time mirror descent applied to sparse phase retrieval, which is the problem of recovering sparse signals from a set of magnitude-only measurements. We apply mirror descent to the unconstrained empirical risk minimization problem (batch setting), using the square loss and square measurements. We provide a convergence analysis of the algorithm in this non-convex setting and prove that, with the hypentropy mirror map, mirror descent recovers any $k$-sparse vector $\mathbf{x}$…
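To make the recipe concrete, the following is a minimal discrete-time sketch of the algorithm the abstract describes: mirror descent with the hypentropy mirror map $\Phi_\beta(\mathbf{x}) = \sum_i \big[ x_i \operatorname{arcsinh}(x_i/\beta) - \sqrt{x_i^2 + \beta^2} \big]$, applied to the empirical risk with the square loss and square measurements. The paper analyzes the continuous-time (mirror flow) limit; the step size, iteration count, $\beta$, and the small uniform initialization below are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def hypentropy_mirror_descent(A, y, beta=1e-6, eta=0.1, n_iters=20000):
    """Discretized mirror descent with the hypentropy mirror map for
    phase retrieval with square measurements y_i = (a_i^T x*)^2 and the
    square loss f(x) = (1/4m) * sum_i (y_i - (a_i^T x)^2)^2."""
    m, n = A.shape
    x = beta * np.ones(n)          # small near-zero start (an assumption)
    u = np.arcsinh(x / beta)       # dual variable u = grad Phi_beta(x)
    for _ in range(n_iters):
        Ax = A @ x
        grad = (A.T @ ((Ax ** 2 - y) * Ax)) / m   # grad f(x)
        u -= eta * grad                           # mirror step in the dual
        x = beta * np.sinh(u)                     # back via (grad Phi)^{-1}
    return x

# Usage: recover a k-sparse vector from Gaussian square measurements.
# Recovery is only up to a global sign, since (a^T x)^2 = (a^T (-x))^2.
rng = np.random.default_rng(0)
n, m, k = 100, 60, 3
x_star = np.zeros(n)
x_star[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))
y = (A @ x_star) ** 2
x_hat = hypentropy_mirror_descent(A, y)
err = min(np.linalg.norm(x_hat - x_star), np.linalg.norm(x_hat + x_star))
```

Since $\nabla \Phi_\beta(\mathbf{x})_i = \operatorname{arcsinh}(x_i/\beta)$ with inverse $x_i = \beta \sinh(u_i)$, a small $\beta$ makes the updates act multiplicatively on near-zero coordinates, which is what drives the sparsity-friendly implicit bias of this scheme.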

Citations

Implicit Regularization in Matrix Sensing via Mirror Descent
TLDR
It is shown that gradient descent with a full-rank factorized parametrization is a first-order approximation to mirror descent, which yields an explicit characterization of the implicit bias of gradient flow as a by-product.
Implicit Bias of SGD for Diagonal Linear Networks: a Provable Benefit of Stochasticity
TLDR
The findings highlight that structured noise can induce better generalisation, helping to explain the superior performance of stochastic gradient descent over gradient descent observed in practice.
Approximate Message Passing with Spectral Initialization for Generalized Linear Models
TLDR
This paper proposes a two-phase artificial AMP algorithm that first produces the spectral estimator and then closely approximates the iterates of the true AMP, yielding a rigorous characterization of the performance of AMP with spectral initialization in the high-dimensional limit.

References

Showing 1-10 of 69 references
Optimal Rates of Convergence for Noisy Sparse Phase Retrieval via Thresholded Wirtinger Flow
TLDR
A novel thresholded gradient descent algorithm is proposed and shown to adaptively achieve the minimax optimal rates of convergence over a wide range of sparsity levels when the $a_j$'s are independent standard Gaussian random vectors, provided that the sample size is sufficiently large compared to the sparsity of $x$.
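For contrast with the mirror descent sketch above, here is a minimal version of the gradient-then-threshold template this entry describes. It is a deliberate simplification: the actual algorithm uses data-driven adaptive thresholds and a dedicated sparse spectral initialization, and the fixed soft-threshold level `tau` below is a placeholder assumption.

```python
import numpy as np

def thresholded_gradient_step(A, y, x, eta=0.1, tau=0.01):
    """One 'gradient step + soft threshold' iteration on the square loss
    f(x) = (1/4m) * sum_i (y_i - (a_i^T x)^2)^2 (simplified sketch)."""
    m = A.shape[0]
    Ax = A @ x
    grad = (A.T @ ((Ax ** 2 - y) * Ax)) / m
    z = x - eta * grad                                     # gradient step
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)   # soft-threshold
```

Iterating this map from a good initialization keeps every iterate sparse, which is where methods of this family get their statistical guarantees across sparsity levels.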
Implicit Regularization for Optimal Sparse Recovery
We investigate implicit regularization schemes for gradient descent methods applied to unpenalized least squares regression to solve the problem of reconstructing a sparse signal from an…
Algorithmic Regularization in Over-parameterized Matrix Sensing and Neural Networks with Quadratic Activations
TLDR
The gradient descent algorithm provides an implicit regularization effect in the learning of over-parameterized matrix factorization models and one-hidden-layer neural networks with quadratic activations, and the results solve the conjecture of Gunasekar et al.
Phase Retrieval via Sparse Wirtinger Flow
TLDR
Numerical tests demonstrate that SWF performs better than state-of-the-art methods, especially in the absence of a priori knowledge of the sparsity $k$, and is also robust to noise.
Phase Retrieval via Wirtinger Flow: Theory and Algorithms
TLDR
This paper develops a nonconvex formulation of the phase retrieval problem as well as a concrete solution algorithm that is shown to rigorously allow the exact retrieval of phase information from a nearly minimal number of random measurements.
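The entry above describes a two-stage template: spectral initialization followed by gradient descent on a nonconvex least-squares objective. Below is a real-valued sketch under simplifying assumptions; the paper treats the complex case and uses a particular step-size schedule, so the constant `eta` and the norm estimate via the mean of `y` are stand-ins.

```python
import numpy as np

def wirtinger_flow(A, y, eta=0.1, n_iters=500):
    """Real-valued sketch of Wirtinger flow: spectral initialization,
    then gradient descent on f(x) = (1/4m) * sum_i (y_i - (a_i^T x)^2)^2."""
    m, n = A.shape
    # Spectral initialization: top eigenvector of (1/m) sum_i y_i a_i a_i^T,
    # scaled by sqrt(mean(y)), which estimates ||x*|| for Gaussian a_i.
    Y = (A.T * y) @ A / m
    _, eigvecs = np.linalg.eigh(Y)       # eigenvalues in ascending order
    x = eigvecs[:, -1] * np.sqrt(np.mean(y))
    for _ in range(n_iters):
        Ax = A @ x
        x -= eta * (A.T @ ((Ax ** 2 - y) * Ax)) / m
    return x
```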
Phase Retrieval Using Alternating Minimization
TLDR
This work provides the first theoretical guarantee for alternating minimization (albeit with resampling) for any variant of the phase retrieval problem in the non-convex setting.
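A compact real-valued sketch of the alternating scheme summarized above: alternately estimate the missing signs (the "phases" of the real case) and solve the induced least-squares problem. The random initialization and the reuse of all samples at every iteration are simplifications; the analyzed algorithm uses a spectral initialization and fresh samples per iteration (resampling).

```python
import numpy as np

def alternating_minimization(A, y, n_iters=50, seed=0):
    """Real-valued alternating minimization for phase retrieval from
    amplitudes b_i = sqrt(y_i) = |a_i^T x*| (simplified sketch)."""
    m, n = A.shape
    b = np.sqrt(y)
    x = np.random.default_rng(seed).standard_normal(n)  # assumed init
    for _ in range(n_iters):
        c = np.sign(A @ x)                              # sign estimates
        x, *_ = np.linalg.lstsq(A, c * b, rcond=None)   # least squares
    return x
```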
Sparse Phase Retrieval via Truncated Amplitude Flow
TLDR
A novel algorithm, SPARTA, reconstructs a sparse signal from a small number of magnitude-only measurements; it is a simple yet effective, scalable, and fast sparse PR solver that is robust against additive noise of bounded support.
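In the same spirit, a simplified single SPARTA-style iteration: a gradient step on the amplitude loss followed by hard thresholding to the k largest-magnitude entries. The gradient truncation and the sparse orthogonality-promoting initialization of the actual algorithm are omitted here, and, unlike the mirror descent sketch further above, this template requires the sparsity level k as an input.

```python
import numpy as np

def hard_threshold(z, k):
    """Keep the k largest-magnitude entries of z, zero out the rest."""
    out = np.zeros_like(z)
    idx = np.argsort(np.abs(z))[-k:]
    out[idx] = z[idx]
    return out

def sparse_amplitude_flow_step(A, b, x, k, eta=0.1):
    """One gradient step on the amplitude loss
    (1/2m) * sum_i (|a_i^T x| - b_i)^2, then hard thresholding (sketch)."""
    m = A.shape[0]
    Ax = A @ x
    grad = (A.T @ (Ax - b * np.sign(Ax))) / m
    return hard_threshold(x - eta * grad, k)
```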
Compressive Phase Retrieval via Reweighted Amplitude Flow
TLDR
A new algorithm for compressive phase retrieval, termed compressive reweighted amplitude flow (CRAF), seeks a sparse initial guess via a new spectral procedure and recovers the underlying signal vector exactly with high probability under suitable conditions.
Implicit Regularization in Nonconvex Statistical Estimation: Gradient Descent Converges Linearly for Phase Retrieval and Matrix Completion
TLDR
Focusing on two statistical estimation problems, namely solving random quadratic systems of equations and low-rank matrix completion, it is established that gradient descent achieves near-optimal statistical and computational guarantees without explicit regularization.
GESPAR: Efficient Phase Retrieval of Sparse Signals
TLDR
This work proposes a fast local search method, GESPAR (GrEedy Sparse PhAse Retrieval), for recovering a sparse signal from measurements of its Fourier transform (or other linear transform) magnitude; unlike previous approaches, it does not require matrix lifting and is therefore potentially suitable for large-scale problems such as images.