# Sparse Regression Codes

```bibtex
@article{Venkataramanan2019SparseRC,
  title   = {Sparse Regression Codes},
  author  = {Ramji Venkataramanan and Sekhar Chandra Tatikonda and Andrew R. Barron},
  journal = {Found. Trends Commun. Inf. Theory},
  year    = {2019},
  volume  = {15},
  pages   = {1-195}
}
```

Developing computationally efficient codes that approach the Shannon-theoretic limits for communication and compression has long been one of the major goals of information and coding theory. […] In the third part, SPARCs are used to construct codes for Gaussian multi-terminal channel and source coding models such as broadcast channels, multiple-access channels, and source and channel coding with side information. The survey concludes with a discussion of open problems and directions for future work.
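The SPARC construction the survey studies encodes a message by choosing one column per section of a Gaussian design matrix. A minimal sketch, with illustrative (not survey-specified) parameters and an equal power allocation:

```python
import numpy as np

# Hypothetical toy parameters for illustration only.
L, M = 4, 8                    # L sections, M columns per section
n = 16                         # codeword length
rate = L * np.log2(M) / n      # rate in bits per channel use

rng = np.random.default_rng(0)
# Design matrix with i.i.d. N(0, 1/n) entries, the standard SPARC ensemble.
A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, L * M))

def encode(bits, power=1.0):
    """Map L*log2(M) bits to a SPARC codeword x = A @ beta, where beta has
    exactly one nonzero entry per section of size M."""
    beta = np.zeros(L * M)
    k = int(np.log2(M))
    for sec in range(L):
        # Each section's k bits select one of its M positions.
        idx = int("".join(str(b) for b in bits[sec * k:(sec + 1) * k]), 2)
        beta[sec * M + idx] = np.sqrt(n * power / L)   # equal power allocation
    return A @ beta, beta

bits = rng.integers(0, 2, size=L * int(np.log2(M)))
x, beta = encode(bits)
```

With these choices the codeword has average power close to `power`, since each nonzero of `beta` equals `sqrt(n*power/L)` and the columns of `A` have roughly unit norm.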

## 22 Citations

### Sparse Superposition Codes for the Gaussian Channel

- Computer Science
- 2018

This thesis aims to understand the construction, encoding, power allocation, and decoding of SPARCs for the unconstrained Gaussian channel from the recent literature, and to investigate SPARCs for various block sizes to understand how those codes would work in practical applications with strong delay constraints.
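Power allocation is one of the design knobs mentioned above. A common capacity-motivated choice in the SPARC literature is an exponentially decaying per-section allocation; a short sketch, assuming unit total power and treating the SNR value as illustrative:

```python
import numpy as np

def exp_power_allocation(snr, L, P=1.0):
    """Exponentially decaying per-section power allocation,
    P_l proportional to 2^(-2*C*l/L), a classic choice for
    capacity-achieving SPARC decoding."""
    C = 0.5 * np.log2(1.0 + snr)                       # AWGN capacity, bits/use
    raw = 2.0 ** (-2.0 * C * np.arange(1, L + 1) / L)  # decaying profile
    return P * raw / raw.sum()                         # normalize: sum equals P

alloc = exp_power_allocation(snr=15.0, L=8)
```

Earlier sections get more power, so the decoder can resolve them first and then peel off the weaker later sections.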

### Capacity-Achieving Spatially Coupled Sparse Superposition Codes With AMP Decoding

- Computer Science
- IEEE Transactions on Information Theory
- 2021

A non-asymptotic bound on the probability of error is obtained and it is proved that spatially coupled SPARCs with AMP decoding achieve the capacity of the AWGN channel.

### Modulated Sparse Superposition Codes for the Complex AWGN Channel

- Computer Science
- IEEE Transactions on Information Theory
- 2021

The results show that introducing modulation to the SPARC design can significantly reduce decoding complexity without sacrificing error performance, and that the resulting codes are asymptotically capacity-achieving for the complex AWGN channel.

### On Sparse Regression LDPC Codes

- Computer Science
- ArXiv
- 2023

A novel concatenated coding structure that combines an LDPC outer code with a SPARC-inspired inner code is introduced; it results in a steep waterfall in error performance, a phenomenon not observed in uncoded SPARCs.

### Unsourced Multiuser Sparse Regression Codes achieve the Symmetric MAC Capacity

- Computer Science
- 2020 IEEE International Symposium on Information Theory (ISIT)
- 2020

This work shows that this concatenated coding construction for U-RA can achieve a vanishing per-user error probability, in the limit of large blocklength and a large number of active users, at sum-rates up to the symmetric Shannon capacity. It also calculates the algorithmic threshold, a bound on the sum-rate up to which the inner decoding can be done reliably with the low-complexity AMP algorithm.

### Secure Coding for the Gaussian Wiretap Channel with Sparse Regression Codes

- Computer Science
- 2021 55th Annual Conference on Information Sciences and Systems (CISS)
- 2021

The Gaussian wiretap channel (WTC), in which two legitimate users communicate in the presence of a passive eavesdropper, is considered; exploiting the nested structure that SPARCs exhibit, a coding scheme is proposed that achieves information-theoretic security and reliability simultaneously.

### Mutual Information and Optimality of Approximate Message-Passing in Random Linear Estimation

- Computer Science
- IEEE Transactions on Information Theory
- 2020

This work shows that the low-complexity approximate message-passing algorithm is optimal outside of the so-called hard phase, in the sense that it asymptotically reaches the minimum mean-square error, and proves two important features of spatially coupled noisy random linear Gaussian estimation.

### An Improved Analysis of Least Squares Superposition Codes with Bernoulli Dictionary

- Computer Science
- Japanese Journal of Statistics and Data Science
- 2019

An improved upper bound on the block error probability under least squares decoding of sparse superposition codes is shown; the bound is both simpler and tighter than the previous result from 2014.

### Unsourced Random Access With Coded Compressed Sensing: Integrating AMP and Belief Propagation

- Computer Science
- IEEE Transactions on Information Theory
- 2022

This article introduces a novel framework where the inner AMP decoder and the outer decoder operate in tandem, dynamically passing information back and forth to take full advantage of the underlying CCS structure.

### Compressed-Coding and Analog Spatial-Coupling using AMP based Decoding

- Computer Science
- GLOBECOM 2020 - 2020 IEEE Global Communications Conference
- 2020

The state evolution analysis of AMP is derived, and it is shown that compressed-coding can approach the Gaussian capacity at a very low compression ratio; the results are extended to systems involving non-linear effects such as clipping.

## References

Showing 1-10 of 127 references

### Sparse regression codes for multi-terminal source and channel coding

- Computer Science
- 2012 50th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
- 2012

With minimum-distance encoding/decoding it is shown that sparse regression codes attain the optimal information-theoretic limits for a variety of multiterminal source and channel coding problems.

### Universal Sparse Superposition Codes With Spatial Coupling and GAMP Decoding

- Computer Science
- IEEE Transactions on Information Theory
- 2019

It is argued that spatially coupled sparse superposition codes universally achieve capacity under GAMP decoding by showing, through analytical computations, that the error floor vanishes and the potential threshold tends to capacity, as one of the code parameters goes to infinity.

### Techniques for Improving the Finite Length Performance of Sparse Superposition Codes

- Computer Science
- IEEE Transactions on Communications
- 2018

A variety of techniques are described to improve sparse superposition codes with approximate message passing (AMP) decoding, including an iterative algorithm for SPARC power allocation, guidelines for choosing codebook parameters, and online estimation of a critical decoding parameter instead of precomputing it.

### Capacity-achieving sparse regression codes via spatial coupling

- Computer Science
- 2018 IEEE Information Theory Workshop (ITW)
- 2018

Non-asymptotic bounds are obtained for the state evolution parameters of SC-SPARCs, which are used to describe the decoding progression in terms of the code parameters and the gap from capacity.

### Capacity-Achieving Sparse Superposition Codes via Approximate Message Passing Decoding

- Computer Science
- IEEE Transactions on Information Theory
- 2017

An approximate message passing decoder for sparse superposition codes, whose decoding complexity scales linearly with the size of the design matrix, is proposed and shown to asymptotically achieve the AWGN capacity with an appropriate power allocation.
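The AMP decoder referred to here iterates a section-wise soft estimate with an Onsager-corrected residual. A minimal sketch for the equal-power case (the capacity results above require a non-uniform allocation; parameters and the hard-decision step are illustrative):

```python
import numpy as np

def amp_decode(y, A, L, M, P=1.0, n_iter=25):
    """Sketch of AMP decoding for a SPARC with equal power allocation.
    Iterates beta <- eta(beta + A.T z), where eta is a section-wise
    softmax denoiser, with an Onsager correction term in the residual."""
    n = A.shape[0]
    c = np.sqrt(n * P / L)                 # value of each nonzero in beta
    beta = np.zeros(L * M)
    z = y.copy()
    for _ in range(n_iter):
        tau2 = np.dot(z, z) / n            # estimate of effective noise variance
        s = (beta + A.T @ z).reshape(L, M) # effective observation, per section
        u = c * s / tau2
        u -= u.max(axis=1, keepdims=True)  # numerical stability for softmax
        w = np.exp(u)
        w /= w.sum(axis=1, keepdims=True)  # posterior weight of each position
        beta = (c * w).reshape(-1)
        onsager = (P - np.dot(beta, beta) / n) / tau2
        z = y - A @ beta + z * onsager     # residual with Onsager correction
    # Hard decision: keep the largest entry in each section.
    est = np.zeros(L * M)
    est[np.arange(L) * M + beta.reshape(L, M).argmax(axis=1)] = c
    return est
```

Each iteration costs one multiplication by `A` and one by `A.T`, which is the linear-in-matrix-size complexity the summary above refers to.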

### Approximate Message-Passing Decoder and Capacity Achieving Sparse Superposition Codes

- Computer Science
- IEEE Transactions on Information Theory
- 2017

Simulations suggest that spatial coupling is more robust and allows for better reconstruction at finite code lengths, and it is shown empirically that a fast Hadamard-based operator enables efficient reconstruction, in terms of both computational time and memory, and the ability to handle very large messages.

### The Error Probability of Sparse Superposition Codes With Approximate Message Passing Decoding

- Computer Science
- IEEE Transactions on Information Theory
- 2019

A large deviation bound on the probability of an AMP decoding error is derived, giving insight into the error performance of the AMP decoder for large but finite problem sizes; the bound yields an error exponent as well as guidance on how the code parameters should be chosen at finite block lengths.
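The error analyses above revolve around state evolution, the scalar recursion that tracks the effective noise variance of AMP across iterations. A Monte Carlo sketch for the equal-power case (parameters are illustrative; the recursion tau^2 <- sigma^2 + P*(1 - x(tau^2)) is the standard one, with x the expected softmax weight on the correct position):

```python
import numpy as np

def state_evolution(snr, L, M, n, n_iter=10, n_mc=10000, seed=0):
    """Monte Carlo sketch of AMP state evolution for an equal-power SPARC.
    x(tau^2) is the expected softmax weight on the true position of one
    section when observed in i.i.d. Gaussian noise of variance tau^2."""
    rng = np.random.default_rng(seed)
    P, sigma2 = 1.0, 1.0 / snr            # unit power; noise variance from SNR
    c = np.sqrt(n * P / L)                # value of each nonzero coefficient
    tau2 = sigma2 + P                     # state evolution initialization
    history = [tau2]
    for _ in range(n_iter):
        Z = rng.normal(size=(n_mc, M))
        a = c / np.sqrt(tau2)
        u = a * Z
        u[:, 0] += a * a                  # true position gets the c^2/tau^2 shift
        u -= u.max(axis=1, keepdims=True) # numerical stability
        w = np.exp(u)
        x = float(np.mean(w[:, 0] / w.sum(axis=1)))
        tau2 = sigma2 + P * (1.0 - x)     # undecoded power plus channel noise
        history.append(tau2)
    return history
```

When the rate is below the algorithmic threshold, the sequence of tau^2 values decreases toward sigma^2, i.e. essentially all sections are decoded.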

### Lattice Coding for Signals and Networks: A Structured Coding Approach to Quantization, Modulation and Multiuser Information Theory

- Computer Science
- 2014

It is shown how high dimensional lattice codes can close the gap to the optimal information theoretic solution, including the characterisation of error exponents, when generalising the framework to Gaussian networks.

### The Rate-Distortion Function and Error Exponent of Sparse Regression Codes with Optimal Encoding

- Computer Science
- 2014

The proof of the rate-distortion result is based on the second moment method, a popular technique for showing that a non-negative random variable $X$ is strictly positive with high probability; a refinement based on Suen's correlation inequality is applied to prove achievability of the optimal Gaussian error exponent.

### Proof of threshold saturation for spatially coupled sparse superposition codes

- Computer Science
- 2016 IEEE International Symposium on Information Theory (ISIT)
- 2016

It is proved that state evolution (which tracks message passing) indeed saturates the potential threshold of the underlying code ensemble, and that this threshold approaches the optimal one in a proper limit.