# Memory AMP

@article{Liu2022MemoryA, title={Memory AMP}, author={Lei Liu and Shunqi Huang and Brian M. Kurkoski}, journal={IEEE Transactions on Information Theory}, year={2022} }

Approximate message passing (AMP) is a low-cost iterative parameter-estimation technique for certain high-dimensional linear systems with non-Gaussian distributions. AMP only applies to independent identically distributed (IID) transform matrices, but may become unreliable (e.g., perform poorly or even diverge) for other matrix ensembles, especially ill-conditioned ones. To solve this issue, orthogonal/vector AMP (OAMP/VAMP) was proposed for general right-unitarily-invariant matrices…
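To make the iterative structure concrete, here is a minimal sketch of standard AMP for sparse recovery from `y = A x + noise` with an IID Gaussian matrix. The soft-thresholding denoiser, the threshold policy (residual RMS), and all function names are illustrative choices, not the paper's algorithm; the Onsager correction in the residual update is the defining feature that plain iterative thresholding lacks.

```python
import numpy as np

def soft_threshold(v, tau):
    # Component-wise soft-thresholding denoiser for sparse signals.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def amp(A, y, n_iter=50):
    """Basic AMP sketch for y = A x + noise with an IID Gaussian A.

    The Onsager term (last summand of the residual update) decouples
    successive iterations; without it this reduces to iterative
    soft-thresholding, which behaves very differently.
    """
    M, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        tau = np.sqrt(np.mean(z ** 2))   # effective noise-level estimate
        r = x + A.T @ z                  # pseudo-data (matched-filter step)
        x = soft_threshold(r, tau)
        # Onsager correction: (N/M) * z * <eta'>, with <eta'> = ||x||_0 / N
        onsager = (z / M) * np.count_nonzero(x)
        z = y - A @ x + onsager
    return x
```

For IID Gaussian `A`, the state evolution mentioned in the abstract predicts the per-iteration mean-square error of exactly this kind of recursion; for other matrix ensembles (e.g., ill-conditioned ones) this vanilla iteration can diverge, which is the motivation for OAMP/VAMP and memory AMP.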


## 4 Citations

### Model-Driven Deep Learning-Based MIMO-OFDM Detector: Design, Simulation, and Experimental Results

- Computer Science · IEEE Transactions on Communications
- 2022

Simulation results and complexity analysis show that the proposed scheme has significant gain over other iterative detection methods and exhibits comparable performance to the state-of-the-art DL-based detector at a reduced computational cost.

### Capacity Optimal Generalized Multi-User MIMO: A Theoretical and Practical Framework

- Computer Science · arXiv
- 2021

A unified framework is proposed to derive the constrained capacity region of GMU-MIMO and design a constrained-capacity-optimal transceiver, which jointly considers encoding, modulation, detection, and decoding.

### Sufficient Statistic Memory Approximate Message Passing

- Computer Science · 2022 IEEE International Symposium on Information Theory (ISIT)
- 2022

To solve the convergence problem of AMP-type algorithms in principle, this paper proposes a memory AMP under a sufficient statistic condition, named sufficient statistic MAMP (SS-MAMP), and shows that the covariance matrices of SS-MAMP are L-banded and convergent.

### Sufficient Statistic Memory AMP

- Computer Science · arXiv
- 2021

Two interesting properties of BO-OAMP/VAMP for large systems are revealed: the covariance matrices are L-banded and are convergent, and damping and memory are not needed (i.e., do not bring performance improvement).

## References


### Compressed Sensing With Upscaled Vector Approximate Message Passing

- Computer Science · IEEE Transactions on Information Theory
- 2022

This work considers the problem of upscaling VAMP by utilizing Conjugate Gradient (CG) to approximate the intractable LMMSE estimator, and proposes a rigorous method for correcting and tuning CG within CG-VAMP to achieve a stable and efficient reconstruction.

### Bayes-Optimal Convolutional AMP

- Computer Science · IEEE Transactions on Information Theory
- 2021

For sensing matrices with low-to-moderate condition numbers, CAMP can achieve the same performance as high-complexity orthogonal/vector AMP that requires the linear minimum mean-square error (LMMSE) filter instead of the MF.

### A Unified Framework of State Evolution for Message-Passing Algorithms

- Computer Science · 2019 IEEE International Symposium on Information Theory (ISIT)
- 2019

A unified framework for understanding the dynamics of message-passing algorithms in compressed sensing is presented, and AMP is proved to converge asymptotically if the sensing matrix is orthogonally invariant and the moment sequence of its asymptotic singular-value distribution coincides with that of the Marčenko-Pastur distribution.

### Rigorous dynamics of expectation-propagation-based signal recovery from unitarily invariant measurements

- Computer Science · 2017 IEEE International Symposium on Information Theory (ISIT)
- 2017

The main result is the justification of an SE formula conjectured by Ma and Ping for an EP-based message-passing algorithm in the large system limit, where both input and output dimensions tend to infinity at an identical speed.

### Estimation of the Mean of a Multivariate Normal Distribution

- Mathematics
- 1981

### On the Convergence of Orthogonal/Vector AMP: Long-Memory Message-Passing Strategy

- Computer Science, Mathematics · IEEE Transactions on Information Theory
- 2022

The convergence of Bayes-optimal orthogonal/vector approximate message passing (OAMP/VAMP) to a fixed point in the large system limit is proved by confirming an exact reduction of the state evolution recursions to those for Bayes-optimal orthogonal/vector AMP.

### Vector approximate message passing

- Computer Science · 2017 IEEE International Symposium on Information Theory (ISIT)
- 2017

This paper considers a “vector AMP” (VAMP) algorithm and shows that VAMP has a rigorous scalar state-evolution that holds under a much broader class of large random matrices A: those that are right-rotationally invariant.

### Orthogonal AMP

- Computer Science · IEEE Access
- 2017

An orthogonal AMP (OAMP) algorithm is proposed, based on de-correlated linear estimation (LE) and divergence-free non-linear estimation (NLE); the results demonstrate that OAMP can be advantageous over AMP, especially for ill-conditioned matrices.

### A theory of solving TAP equations for Ising models with general invariant random matrices

- Computer Science · arXiv
- 2015

An analysis of iterative algorithms is presented using a dynamical functional approach that, in the thermodynamic limit, yields an effective dynamics of a single-variable trajectory.

### The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing

- Computer Science · IEEE Transactions on Information Theory
- 2010

This paper proves that state evolution indeed holds asymptotically in the large system limit for sensing matrices with independent and identically distributed Gaussian entries, providing a rigorous foundation for state evolution.