Phase Retrieval Using Alternating Minimization in a Batch Setting

@article{Zhang2017PhaseRU,
  title={Phase Retrieval Using Alternating Minimization in a Batch Setting},
  author={Teng Zhang},
  journal={2018 Information Theory and Applications Workshop (ITA)},
  year={2017},
  pages={1-17}
}
  • Teng Zhang
  • Published 25 June 2017
  • Computer Science
  • 2018 Information Theory and Applications Workshop (ITA)
This paper considers the problem of phase retrieval, where the goal is to recover a signal $z\in\mathbb{C}^n$ from the observations $y_i=\vert a_i^\ast z\vert$, $i=1,2,\ldots,m$. While many algorithms have been proposed, the alternating minimization algorithm has been one of the most commonly used methods, and it is very simple to implement. Current work [26] has proved that when the observation vectors $\{a_i\}_{i=1}^{m}$ are sampled from a complex Gaussian distribution $N$…
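For illustration, here is a minimal NumPy sketch of the alternating minimization loop described above: alternate between estimating the missing phases under the current iterate and solving the resulting least-squares problem. The random initialization, iteration count, and the small synthetic check are assumptions of this sketch, not details taken from the paper (whose analysis concerns the batch/resampling variants).

```python
import numpy as np

def alternating_minimization(A, y, iters=500, z0=None, seed=0):
    """Recover z from y = |A @ z| by alternating minimization (sketch).

    Alternates between (1) estimating the missing phases given the current
    iterate and (2) solving the resulting linear least-squares problem.
    """
    m, n = A.shape
    rng = np.random.default_rng(seed)
    z = z0 if z0 is not None else rng.standard_normal(n) + 1j * rng.standard_normal(n)
    for _ in range(iters):
        phases = np.exp(1j * np.angle(A @ z))                # step 1: fix z, pick phases
        z, *_ = np.linalg.lstsq(A, y * phases, rcond=None)   # step 2: fix phases, solve for z
    return z

# Tiny synthetic check; recovery is only defined up to a global phase, and
# success depends on m being large enough (and, in theory, on the initialization).
rng = np.random.default_rng(1)
n, m = 20, 200
z_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
y = np.abs(A @ z_true)
z_hat = alternating_minimization(A, y)
align = np.exp(1j * np.angle(np.vdot(z_hat, z_true)))        # best global phase alignment
print(np.linalg.norm(align * z_hat - z_true) / np.linalg.norm(z_true))
```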

Citations

A stochastic alternating minimizing method for sparse phase retrieval

Sparse phase retrieval plays an important role in many fields of applied science and thus attracts lots of attention. In this paper, we propose a stochastic alternating…

Learning without the Phase: Regularized PhaseMax Achieves Optimal Sample Complexity

A convex optimization problem is formulated in which the objective function relies on an initial estimate of the true signal and also includes an additive regularization term to encourage structure; the new formulation is referred to as regularized PhaseMax.

Phase retrieval of complex-valued objects via a randomized Kaczmarz method

  • Teng Zhang
  • Mathematics, Computer Science
    Information and Inference: A Journal of the IMA
  • 2021
The connection between the convergence of the algorithm and the convexity of an objective function is established, and it is demonstrated that when the sensing vectors are sampled uniformly from a unit sphere and the number of sensing vectors satisfies $m>O(n\log n)$ as $n, m\rightarrow\infty$, this algorithm with a good initialization achieves linear convergence to the solution with high probability.
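As a rough illustration of the kind of update analyzed there, the following NumPy sketch shows one common form of a randomized Kaczmarz step for phase retrieval. The row-sampling rule, the fixed iteration budget, and the assumption of a good initial guess z0 are choices of this sketch, not details taken from the cited paper.

```python
import numpy as np

def kaczmarz_phase_retrieval(A, y, z0, iters=20000, seed=0):
    """Randomized Kaczmarz-style iteration for y = |A @ z| (sketch).

    At each step, pick a random measurement i and project the current iterate
    onto the hyperplane {x : a_i^* x = y_i * phase(a_i^* z)}.
    """
    m, n = A.shape
    rng = np.random.default_rng(seed)
    row_norms = np.linalg.norm(A, axis=1) ** 2
    probs = row_norms / row_norms.sum()            # sample rows by squared norm (a common choice)
    z = z0.astype(complex)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        a = A[i]
        r = np.vdot(a, z)                          # a_i^* z under the current iterate
        target = y[i] * np.exp(1j * np.angle(r))   # keep the phase, enforce the magnitude
        z = z + (target - r) / row_norms[i] * a    # Kaczmarz projection onto the hyperplane
    return z
```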

Phase Retrieval by Alternating Minimization With Random Initialization

  • Teng Zhang
  • Computer Science
    IEEE Transactions on Information Theory
  • 2020
It is shown that the classical algorithm of alternating minimization with random initialization succeeds with high probability under a suitable growth condition on the number of measurements, which is a step toward proving the conjecture in prior work that the algorithm succeeds when $m=O(n)$.

Accelerated Alternating Projections for Robust Principal Component Analysis

A new algorithm, dubbed accelerated alternating projections, is introduced for robust PCA; it significantly improves the computational efficiency of the existing alternating projections algorithm proposed in [Netrapalli et al., 2014] when updating the low-rank factor.
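For context, here is a NumPy sketch of the basic (non-accelerated) alternating-projections idea that such methods build on: alternately project onto low-rank matrices via a truncated SVD and onto sparse matrices via entrywise hard thresholding. The fixed rank, fixed threshold, and iteration count are assumptions of this sketch; the accelerated algorithm in the cited paper differs.

```python
import numpy as np

def alt_proj_rpca(M, rank, thresh, iters=50):
    """Split M into low-rank L plus sparse S by basic alternating projections (sketch)."""
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(iters):
        # Project M - S onto matrices of rank at most `rank` (truncated SVD).
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Project M - L onto sparse matrices (entrywise hard thresholding).
        R = M - L
        S = np.where(np.abs(R) > thresh, R, 0.0)
    return L, S
```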

Phase Retrieval via a Modified Null Vector Estimator

Experimental results clearly demonstrate the superiority of the proposed TNI estimator, which not only achieves a lower relative error of initialization but also outperforms the traditional methods in terms of accuracy and noise stability when measurements are contaminated with noise.

Max-Affine Regression: Provable, Tractable, and Near-Optimal Statistical Estimation

This work analyzes a natural alternating minimization (AM) algorithm for the non-convex least squares objective and shows that the AM algorithm, when initialized suitably, converges with high probability and at a geometric rate to a small ball around the optimal coefficients.
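A minimal NumPy sketch of such an alternating minimization loop for max-affine regression is given below: alternate between assigning each sample to its current argmax affine piece and refitting each piece by least squares. The random initialization and the handling of under-populated pieces are assumptions of this sketch, not the paper's initialization scheme.

```python
import numpy as np

def max_affine_am(X, y, K, iters=50, seed=0):
    """Alternating minimization for max-affine regression (sketch):
    fit y ~ max_k (X @ w_k + b_k) by alternating between
    (1) assigning each sample to its current argmax affine piece and
    (2) refitting each piece by ordinary least squares on its own samples.
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((K, d))
    b = rng.standard_normal(K)
    Xb = np.hstack([X, np.ones((n, 1))])           # augmented design for intercepts
    for _ in range(iters):
        assign = np.argmax(X @ W.T + b, axis=1)    # step 1: piece assignments
        for k in range(K):
            idx = assign == k
            if idx.sum() >= d + 1:                 # refit only well-populated pieces
                coef, *_ = np.linalg.lstsq(Xb[idx], y[idx], rcond=None)
                W[k], b[k] = coef[:d], coef[d]
    return W, b
```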

Sparsity-based Phase Retrieval from Diffractive Optical Imaging

Numerical results show that the proposed coding elements are able to solve sparsity-based PR for any optical field under an admissible modulation, and identify the types of admissible coding elements that best estimate the support.

A note on Douglas-Rachford, subgradients, and phase retrieval

It is shown that in some cases a generalization of Douglas-Rachford, called relaxed-reflect-reflect (RRR), can be viewed as gradient descent on a certain objective function.

An Oracle Gradient Regularized Newton Method for Quadratic Measurements Regression

A gradient regularized Newton method (GRNM) is developed to solve the least squares problem and it is proved that it converges to a unique local minimum at a superlinear rate under certain mild conditions.

References

Showing 1–10 of 31 references

A Geometric Analysis of Phase Retrieval

It is proved that when the measurement vectors are generic, with high probability, a natural least-squares formulation for GPR has the following benign geometric structure: (1) There are no spurious local minimizers, and all global minimizers are equal to the target signal, up to a global phase, and (2) the objective function has a negative directional curvature around each saddle point.

Corruption Robust Phase Retrieval via Linear Programming

It is shown that a fixed $x_0$ can be recovered exactly from corrupted magnitude measurements with high probability for $m = O(n)$, where $a_i \in \mathbb{R}^n$ are i.i.d. standard Gaussian and $\eta \in \mathbb{R}^m$ has fixed sparse support.

Solving Random Systems of Quadratic Equations via Truncated Generalized Gradient Flow

It is proved that as soon as the number of equations $m$ is on the order of the number of unknowns $n$, TGGF recovers the solution exactly (up to a global unimodular constant) with high probability and with complexity growing linearly with the time required to read the data.

PhaseLift: Exact and Stable Signal Recovery from Magnitude Measurements via Convex Programming

It is shown that in some instances, the combinatorial phase retrieval problem can be solved by convex programming techniques, and it is proved that the methodology is robust vis‐à‐vis additive noise.

Optimal Rates of Convergence for Noisy Sparse Phase Retrieval via Thresholded Wirtinger Flow

A novel thresholded gradient descent algorithm is proposed and it is shown to adaptively achieve the minimax optimal rates of convergence over a wide range of sparsity levels when the $a_j$'s are independent standard Gaussian random vectors, provided that the sample size is sufficiently large compared to the sparsity of $x$.
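The following NumPy sketch illustrates the general idea of a thresholded gradient step for sparse phase retrieval: a Wirtinger-flow-style gradient step on the intensity loss followed by soft-thresholding. The real-valued signal model, fixed step size, and fixed threshold are simplifying assumptions of this sketch; the cited paper's initialization and adaptive thresholding rules are omitted.

```python
import numpy as np

def soft_threshold(v, t):
    """Entrywise soft-thresholding operator."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def thresholded_gradient_pr(A, y, x0, step=1e-3, thresh=1e-2, iters=500):
    """Thresholded gradient descent for a sparse real x with y_i = |a_i^T x| (sketch).

    Uses the intensity loss f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i^2)^2 and
    soft-thresholds after every gradient step to promote sparsity.
    """
    m, n = A.shape
    x = x0.copy()
    for _ in range(iters):
        Ax = A @ x
        grad = A.T @ ((Ax ** 2 - y ** 2) * Ax) / m   # gradient of the intensity loss
        x = soft_threshold(x - step * grad, thresh)
    return x
```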

Structured Signal Recovery From Quadratic Measurements: Breaking Sample Complexity Barriers via Nonconvex Optimization

The projected gradient descent, when initialized in a neighborhood of the desired signal, converges to the unknown signal at a linear rate, and is the first provably tractable algorithm for this data-poor regime.

Phase Retrieval Using Alternating Minimization

This work represents the first theoretical guarantee for alternating minimization (albeit with resampling) for any variant of phase retrieval problems in the non-convex setting.

Phase Retrieval With Random Gaussian Sensing Vectors by Alternating Projections

It is shown that the classical algorithm of alternating projections (Gerchberg–Saxton) succeeds with high probability when a suitable initialization procedure is used, and it is conjectured that this result is still true when no special initialization procedure is used.

Solving Quadratic Equations via PhaseLift When There Are About as Many Equations as Unknowns

It is shown that any complex vector can be recovered exactly from on the order of $n$ quadratic equations of the form $|\langle a_i, x_0\rangle|^2 = b_i$, $i=1,\ldots,m$, by using a semidefinite program known as PhaseLift, improving upon earlier bounds.
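As an illustration of the lifted convex approach, here is a small cvxpy sketch of a PhaseLift-style trace-minimization program over Hermitian PSD matrices matching the quadratic measurements $b_i = |\langle a_i, x_0\rangle|^2$. The exact formulation in the cited paper may differ (e.g., a feasibility or noise-aware variant), and cvxpy with an SDP-capable solver is assumed to be available.

```python
import numpy as np
import cvxpy as cp

def phaselift(A, b):
    """PhaseLift-style convex relaxation (sketch): find a Hermitian PSD matrix X
    with a_i^* X a_i = b_i while minimizing trace(X), then take the top
    eigenvector of X as the estimate (defined only up to a global phase).
    """
    m, n = A.shape
    X = cp.Variable((n, n), hermitian=True)
    constraints = [X >> 0]
    for i in range(m):
        Ai = np.outer(A[i], A[i].conj())            # the rank-one matrix a_i a_i^*
        constraints.append(cp.real(cp.trace(Ai @ X)) == b[i])
    cp.Problem(cp.Minimize(cp.real(cp.trace(X))), constraints).solve()
    w, V = np.linalg.eigh(X.value)                  # eigendecomposition of the lifted solution
    return np.sqrt(max(w[-1], 0.0)) * V[:, -1]
```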

Phase Retrieval Meets Statistical Learning Theory: A Flexible Convex Relaxation

A flexible convex relaxation for the phase retrieval problem is proposed that operates in the natural domain of the signal to avoid the prohibitive computational cost associated with "lifting" and semidefinite programming, and competes with recently developed non-convex techniques for phase retrieval.