# Phase Retrieval Using Alternating Minimization in a Batch Setting

@article{Zhang2017PhaseRU,
title={Phase Retrieval Using Alternating Minimization in a Batch Setting},
author={Teng Zhang},
journal={2018 Information Theory and Applications Workshop (ITA)},
year={2017},
pages={1-17}
}
• Teng Zhang
• Published 25 June 2017
• Computer Science
• 2018 Information Theory and Applications Workshop (ITA)
This paper considers the problem of phase retrieval, where the goal is to recover a signal $z\in{\mathbb{C}}^n$ from the observations $y_i=\vert a_i^\ast z\vert$, $i=1,2,\ldots,m$. While many algorithms have been proposed, the alternating minimization algorithm has been one of the most commonly used methods, and it is very simple to implement. Current work has proved that when the observation vectors $\{a_i\}_{i=1}^{m}$ are sampled from a complex Gaussian distribution $N\ldots$
• Computer Science
ArXiv
• 2019
Sparse phase retrieval plays an important role in many fields of applied science and thus attracts lots of attention. In this paper, we propose a stochastic alternating …
• Computer Science
NeurIPS
• 2018
A convex optimization problem, where the objective function relies on an initial estimate of the true signal and also includes an additive regularization term to encourage structure; the new formulation is referred to as regularized PhaseMax.
• Teng Zhang
• Mathematics, Computer Science
Information and Inference: A Journal of the IMA
• 2021
The connection between the convergence of the algorithm and the convexity of an objective function is established, and it is demonstrated that when the sensing vectors are sampled uniformly from a unit sphere and the number of sensing vectors satisfies $m>O(n\log n)$ as $n, m\rightarrow\infty$, this algorithm with a good initialization achieves linear convergence to the solution with high probability.
• Teng Zhang
• Computer Science
IEEE Transactions on Information Theory
• 2020
For the phase retrieval problem, it is shown that the classical algorithm of alternating minimization with random initialization succeeds with high probability as …, which is a step toward proving the conjecture that the algorithm succeeds when $m=O(n)$.
• Computer Science
J. Mach. Learn. Res.
• 2019
A new algorithm, dubbed accelerated alternating projections, is introduced for robust PCA, which significantly improves the computational efficiency of the existing alternating projections proposed in [Netrapalli, Praneeth, et al., 2014] when updating the low-rank factor.
• Computer Science
2018 10th International Conference on Wireless Communications and Signal Processing (WCSP)
• 2018
Experimental results clearly demonstrate the superiority of the proposed TNI estimator, which not only achieves a lower relative initialization error but also outperforms the traditional methods in terms of accuracy and noise stability when measurements are contaminated with noise.
• Computer Science, Mathematics
ArXiv
• 2019
This work analyzes a natural alternating minimization (AM) algorithm for the non-convex least squares objective and shows that the AM algorithm, when initialized suitably, converges with high probability and at a geometric rate to a small ball around the optimal coefficients.
• Computer Science
2019 XXII Symposium on Image, Signal Processing and Artificial Vision (STSIVA)
• 2019
Numerical results show that the proposed coding elements are able to solve sparsity-based PR at any optical field under an admissible modulation and show the types of admissible coding elements that best estimate the support.
• Computer Science
ArXiv
• 2019
It is shown that in some cases a generalization of Douglas-Rachford, called relaxed-reflect-reflect (RRR), can be viewed as gradient descent on a certain objective function.
• Computer Science, Mathematics
• 2022
A gradient regularized Newton method (GRNM) is developed to solve the least squares problem, and it is proved to converge to a unique local minimum at a superlinear rate under certain mild conditions.
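The alternating minimization scheme discussed throughout these papers is simple enough to sketch in a few lines. The following is a minimal NumPy illustration, not the authors' code: it assumes complex Gaussian sensing vectors, random initialization, and hypothetical names (`alternating_minimization`, the `m = 8n` toy sizes); it alternates between estimating the missing phases and solving a least-squares problem.

```python
import numpy as np

def alternating_minimization(A, y, n_iters=500, seed=0):
    """Sketch of alternating minimization for phase retrieval.

    Recovers z (up to a global phase) from y_i = |a_i^* z|, where the
    i-th row of A is a_i^*.  Each iteration (1) guesses the missing
    phases from the current iterate, then (2) solves a least-squares
    problem with those phases fixed.
    """
    m, n = A.shape
    rng = np.random.default_rng(seed)
    # Random initialization (several of the papers above also study
    # spectral or truncated initializations instead).
    z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    A_pinv = np.linalg.pinv(A)  # reused by every least-squares step
    for _ in range(n_iters):
        phases = np.exp(1j * np.angle(A @ z))  # phase step
        z = A_pinv @ (y * phases)              # argmin_z ||A z - y*phases||
    return z

# Toy instance: complex Gaussian sensing vectors, m = 8n measurements.
rng = np.random.default_rng(1)
n, m = 20, 160
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
z_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = np.abs(A @ z_true)

z_hat = alternating_minimization(A, y)
# z is identifiable only up to a global phase; align before measuring error.
c = np.vdot(z_hat, z_true)
rel_err = np.linalg.norm(z_true - (c / abs(c)) * z_hat) / np.linalg.norm(z_true)
```

Each iteration can only decrease the least-squares objective, so the measurement residual is nonincreasing; the convergence guarantees in the papers above concern when this descent actually reaches the true signal.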