# Phase Retrieval Using Alternating Minimization in a Batch Setting

```bibtex
@article{Zhang2017PhaseRU,
  title={Phase Retrieval Using Alternating Minimization in a Batch Setting},
  author={Teng Zhang},
  journal={2018 Information Theory and Applications Workshop (ITA)},
  year={2017},
  pages={1-17}
}
```
• Teng Zhang
• Published 25 June 2017
• Computer Science
• 2018 Information Theory and Applications Workshop (ITA)
This paper considers the problem of phase retrieval, where the goal is to recover a signal $z\in\mathbb{C}^n$ from the observations $y_i=\vert a_i^\ast z\vert$, $i=1,2,\ldots,m$. While many algorithms have been proposed, the alternating minimization algorithm has been one of the most commonly used methods, and it is very simple to implement. Recent work [26] has proved that when the observation vectors $\{a_i\}_{i=1}^{m}$ are sampled from a complex Gaussian distribution $N…

## Citations

• Computer Science
ArXiv
• 2019
Sparse phase retrieval plays an important role in many fields of applied science and thus attracts lots of attention. In this paper, we propose a stochastic alternating…

• Computer Science
NeurIPS
• 2018
A convex optimization problem, where the objective function relies on an initial estimate of the true signal and also includes an additive regularization term to encourage structure; the new formulation is referred to as regularized PhaseMax.

• Teng Zhang
• Mathematics, Computer Science
Information and Inference: A Journal of the IMA
• 2021
The connection between the convergence of the algorithm and the convexity of an objective function is established, and it is demonstrated that when the sensing vectors are sampled uniformly from a unit sphere and the number of sensing vectors satisfies $m>O(n\log n)$ as $n, m\rightarrow\infty$, this algorithm with a good initialization achieves linear convergence to the solution with high probability.

• Teng Zhang
• Computer Science
IEEE Transactions on Information Theory
• 2020
It is shown that the classical algorithm of alternating minimization with random initialization succeeds with high probability as $m/n\rightarrow\infty$, a step toward proving the conjecture that the algorithm succeeds when $m=O(n)$.
• Computer Science
J. Mach. Learn. Res.
• 2019
A new algorithm, dubbed accelerated alternating projections, is introduced for robust PCA, which significantly improves the computational efficiency of the existing alternating projections proposed in [Netrapalli, Praneeth, et al., 2014] when updating the low-rank factor.

• Computer Science
2018 10th International Conference on Wireless Communications and Signal Processing (WCSP)
• 2018
Experimental results clearly demonstrate the superiority of the proposed TNI estimator, which not only achieves a lower relative error of initialization but also outperforms the traditional methods in terms of accuracy and noise stability when measurements are contaminated with noise.

• Computer Science, Mathematics
ArXiv
• 2019
This work analyzes a natural alternating minimization (AM) algorithm for the non-convex least squares objective and shows that the AM algorithm, when initialized suitably, converges with high probability and at a geometric rate to a small ball around the optimal coefficients.

• Computer Science
2019 XXII Symposium on Image, Signal Processing and Artificial Vision (STSIVA)
• 2019
Numerical results show that the proposed coding elements are able to solve sparsity-based PR at any optical field under an admissible modulation and show the types of admissible coding elements that best estimate the support.

• Computer Science
ArXiv
• 2019
It is shown that in some cases a generalization of Douglas–Rachford, called relaxed-reflect-reflect (RRR), can be viewed as gradient descent on a certain objective function.

• Computer Science, Mathematics
• 2022
A gradient regularized Newton method (GRNM) is developed to solve the least squares problem, and it is proved that it converges to a unique local minimum at a superlinear rate under certain mild conditions.
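The alternating minimization procedure described in the abstract above, and analyzed in several of the works listed here, can be sketched in a few lines: alternate between picking the best phases for the current signal estimate and solving a least-squares problem with those phases attached. This is only an illustrative sketch; the function name, problem sizes, and the plain random initialization are our own choices, not the paper's exact scheme (which analyzes more careful initializations and resampling).

```python
import numpy as np

def alternating_minimization(A, y, iters=200, seed=0):
    """Gerchberg-Saxton-style alternating minimization for phase retrieval.

    A : (m, n) complex sensing matrix whose rows are a_i^*
    y : (m,) nonnegative magnitudes y_i = |a_i^* z|
    Returns an estimate of z, recoverable only up to a global phase.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Plain random initialization (a sketch; not the paper's scheme).
    z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    for _ in range(iters):
        # Phase step: the best phases given the current signal estimate.
        p = A @ z
        phases = p / np.maximum(np.abs(p), 1e-12)
        # Signal step: least squares with the magnitudes re-attached.
        z, *_ = np.linalg.lstsq(A, y * phases, rcond=None)
    return z

# Synthetic check: m = 8n complex Gaussian measurements of a random signal.
rng = np.random.default_rng(1)
n, m = 20, 160
z_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
y = np.abs(A @ z_true)
z_hat = alternating_minimization(A, y)
# Align the global phase before comparing against the truth.
inner = np.vdot(z_hat, z_true)
err = np.linalg.norm(z_hat * (inner / abs(inner)) - z_true) / np.linalg.norm(z_true)
```

Note the phase alignment at the end: since $y_i = |a_i^\ast z|$ is invariant to multiplying $z$ by any unit complex number, estimates can only be compared after fixing that global phase.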
## References

Showing 1-10 of 31 references.

• Computer Science
2016 IEEE International Symposium on Information Theory (ISIT)
• 2016
It is proved that when the measurement vectors are generic, with high probability, a natural least-squares formulation for GPR has the following benign geometric structure: (1) there are no spurious local minimizers, and all global minimizers are equal to the target signal, up to a global phase; and (2) the objective function has a negative directional curvature around each saddle point.

• Computer Science
ArXiv
• 2016
It is shown that a fixed $x_0$ can be recovered exactly from corrupted magnitude measurements with high probability for $m = O(n)$, where $a_i \in \mathbb{R}^n$ are i.i.d. standard Gaussian and $\eta \in \mathbb{R}^m$ has fixed sparse support.

• Computer Science
NIPS
• 2016
It is proved that as soon as the number of equations $m$ is on the order of the number of unknowns $n$, TGGF recovers the solution exactly (up to a global unimodular constant) with high probability and complexity growing linearly with the time required to read the data.
• Computer Science
ArXiv
• 2011
It is shown that in some instances, the combinatorial phase retrieval problem can be solved by convex programming techniques, and it is proved that the methodology is robust vis‐à‐vis additive noise.
• Computer Science
ArXiv
• 2015
A novel thresholded gradient descent algorithm is proposed and shown to adaptively achieve the minimax optimal rates of convergence over a wide range of sparsity levels when the $a_j$'s are independent standard Gaussian random vectors, provided that the sample size is sufficiently large compared to the sparsity of $x$.
The projected gradient descent method, when initialized in a neighborhood of the desired signal, converges to the unknown signal at a linear rate, making it the first provably tractable algorithm for this data-poor regime.
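The thresholded-gradient idea behind these entries can be sketched in the real-valued case: gradient descent on the squared-intensity loss, followed by hard thresholding to the $k$ largest entries after each step. The step size, initialization near the truth, and all names here are our own illustrative assumptions, not the cited papers' exact algorithms (which use careful data-driven initializations and adaptive thresholds).

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x and zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def thresholded_gd(A, y, k, step=0.05, iters=100, x0=None):
    """Sketch of thresholded gradient descent for sparse phase retrieval.

    Runs gradient descent on f(x) = (1/(4m)) * sum_j ((a_j . x)^2 - y_j^2)^2
    and hard-thresholds to the k largest entries after every step.
    """
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.copy()
    for _ in range(iters):
        r = A @ x
        grad = (A.T @ ((r**2 - y**2) * r)) / m   # gradient of f at x
        x = hard_threshold(x - step * grad, k)
    return x

# Synthetic check, started near the truth (these methods are analyzed with
# a careful initialization; a random start may fail on this nonconvex loss).
rng = np.random.default_rng(0)
n, m, k = 50, 500, 5
x_true = np.zeros(n)
x_true[:k] = 1.0
A = rng.standard_normal((m, n))
y = np.abs(A @ x_true)
x0 = x_true + 0.1 * rng.standard_normal(n)
x_hat = thresholded_gd(A, y, k, x0=x0)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

Because the true signal is exactly $k$-sparse and noiseless here, the iterates lock onto the correct support after thresholding and then contract toward the true signal (up to a global sign in the real case).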
• Computer Science, Mathematics
IEEE Transactions on Signal Processing
• 2015
This work represents the first theoretical guarantee for alternating minimization (albeit with resampling) for any variant of phase retrieval problems in the non-convex setting.
It is shown that the classical algorithm of alternating projections (Gerchberg–Saxton) succeeds with high probability provided a careful initialization, and it is conjectured that this result is still true when no special initialization procedure is used.
• Mathematics
Found. Comput. Math.
• 2014
It is shown that any complex vector can be recovered exactly from on the order of $n$ quadratic equations of the form $|\langle a_i, x_0\rangle|^2 = b_i$, $i=1,\ldots,m$, by using a semidefinite program known as PhaseLift, improving upon earlier bounds.
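The "lifting" behind PhaseLift can be stated as a semidefinite program; the following is a sketch of the well-known formulation, written in our notation rather than copied from this reference:

```latex
% The quadratic constraints become linear in the lifted variable X = x x^*,
% since |\langle a_i, x_0 \rangle|^2 = a_i^* (x_0 x_0^*) a_i.
\begin{aligned}
\min_{X \in \mathbb{C}^{n \times n}} \quad & \operatorname{tr}(X) \\
\text{subject to} \quad & a_i^* X a_i = b_i, \quad i = 1, \ldots, m, \\
& X \succeq 0.
\end{aligned}
```

When the recovered minimizer $X$ has rank one, factoring it as $X = x x^*$ returns the signal up to a global phase; the trace objective serves as a convex surrogate for rank.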
• Computer Science
AISTATS
• 2017
A flexible convex relaxation for the phase retrieval problem that operates in the natural domain of the signal, avoiding the prohibitive computational cost associated with "lifting" and semidefinite programming while competing with recently developed non-convex techniques for phase retrieval.