# Using List Decoding to Improve the Finite-Length Performance of Sparse Regression Codes

```bibtex
@article{Cao2021UsingLD,
  title={Using List Decoding to Improve the Finite-Length Performance of Sparse Regression Codes},
  author={Haiwen Cao and Pascal O. Vontobel},
  journal={IEEE Transactions on Communications},
  year={2021},
  volume={69},
  pages={4282--4293}
}
```
• Published 31 October 2020
• IEEE Transactions on Communications

We consider sparse regression codes (SPARCs) over complex AWGN channels. Such codes can be efficiently decoded by an approximate message passing (AMP) decoder, whose performance can be predicted via so-called state evolution in the large-system limit. In this paper, we mainly focus on how to use concatenation of SPARCs and cyclic redundancy check (CRC) codes on the encoding side, together with list decoding on the decoding side, to improve the finite-length performance of the AMP decoder for SPARCs…
## 1 Citation

### Using List Decoding to Improve the Finite-Length Performance of Sparse Regression Codes

2020 IEEE Information Theory Workshop (ITW), 2021
This paper focuses on how to use concatenation of SPARCs and cyclic redundancy check (CRC) codes on the encoding side and list decoding on the decoding side to improve the finite-length performance of the AMP decoder for SPARCs over complex AWGN channels.

## References

Showing 1–10 of 19 references

### Techniques for Improving the Finite Length Performance of Sparse Superposition Codes

IEEE Transactions on Communications, 2018
A variety of techniques to improve sparse superposition codes with approximate message passing (AMP) decoding are described, including an iterative algorithm for SPARC power allocation, guidelines for choosing codebook parameters, and a method for estimating a critical decoding parameter online instead of precomputing it.
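For context on the power-allocation techniques this reference improves upon: the standard baseline allocates power across the L sections in an exponentially decaying profile, P_l ∝ 2^(-2Cl/L). The sketch below computes only this baseline (the function name is my own); the iterative algorithm in the cited paper refines such profiles for finite lengths.

```python
import numpy as np

def exponential_power_allocation(P: float, C: float, L: int) -> np.ndarray:
    """Exponentially decaying SPARC power allocation, P_l proportional to
    2^(-2*C*l/L), normalized so that the L section powers sum to P."""
    ell = np.arange(1, L + 1)
    weights = 2.0 ** (-2.0 * C * ell / L)
    return P * weights / weights.sum()
```

Earlier sections get more power so they decode first; later sections then face less interference, which is what lets AMP decoding progress section by section.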

### Capacity-Achieving Sparse Superposition Codes via Approximate Message Passing Decoding

IEEE Transactions on Information Theory, 2017
An approximate message passing decoder for sparse superposition codes, whose decoding complexity scales linearly with the size of the design matrix, is proposed and shown to asymptotically achieve the AWGN capacity with an appropriate power allocation.

### The Error Probability of Sparse Superposition Codes With Approximate Message Passing Decoding

IEEE Transactions on Information Theory, 2019
A large deviation bound on the probability of an AMP decoding error is derived, giving insight into the error performance of the AMP decoder for large but finite problem sizes; the bound yields an error exponent as well as guidance on how the code parameters should be chosen at finite block lengths.

### Modulated Sparse Superposition Codes for the Complex AWGN Channel

IEEE Transactions on Information Theory, 2021
The results show that introducing modulation to the SPARC design can significantly reduce decoding complexity without sacrificing error performance, and that the resulting codes remain asymptotically capacity-achieving for the complex AWGN channel.

### Lossy compression via sparse linear regression: Computationally efficient encoding and decoding

2013 IEEE International Symposium on Information Theory, 2013
The sparse regression code is robust in the following sense: for any ergodic source, the proposed encoder achieves the optimal distortion-rate function of an i.i.d. Gaussian source with the same variance.

### Lossy Compression via Sparse Linear Regression: Performance Under Minimum-Distance Encoding

IEEE Transactions on Information Theory, 2014
A new class of codes for lossy compression with the squared-error distortion criterion, designed using the statistical framework of high-dimensional linear regression, is studied, showing that such a code can attain the Shannon rate-distortion function with the optimal error exponent.

### Spatially Coupled Sparse Regression Codes: Design and State Evolution Analysis

2018 IEEE International Symposium on Information Theory (ISIT), 2018
An asymptotic characterization of the state evolution equations for SC-SPARCs is given and it is shown that AMP decoding succeeds in the large system limit for all rates $R < \mathcal{C}$.
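The state evolution recursion referenced here can be sketched numerically in its uncoupled, flat-power-allocation form: a single scalar tau2 tracks the effective noise variance seen by the denoiser, and the expectation in the recursion is estimated by Monte Carlo. Function names, the Monte Carlo approach, and all parameter values are my own illustrative choices; the cited paper analyzes the spatially coupled generalization.

```python
import numpy as np

def x_of_tau(tau2, c, M, num_mc=4000, rng=None):
    """Monte Carlo estimate of the expected posterior weight placed on the
    true entry of one section when the effective noise variance is tau2."""
    rng = rng if rng is not None else np.random.default_rng(1)
    s = np.sqrt(tau2) * rng.normal(size=(num_mc, M))
    s[:, 0] += c                        # position 0 holds the true entry
    logits = s * (c / tau2)
    logits -= logits.max(axis=1, keepdims=True)   # stabilize the softmax
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    return p[:, 0].mean()

def state_evolution(P, sigma2, L, M, n, num_iter=15):
    """Scalar state evolution: tau2_{t+1} = sigma2 + P * (1 - x(tau2_t)),
    starting from tau2_0 = sigma2 + P (flat power allocation)."""
    c = np.sqrt(n * P / L)
    tau2 = sigma2 + P
    history = [tau2]
    for _ in range(num_iter):
        tau2 = sigma2 + P * (1.0 - x_of_tau(tau2, c, M))
        history.append(tau2)
    return history
```

When the rate is below the AMP threshold, the recursion drives tau2 down toward sigma2, which predicts successful decoding in the large-system limit; a fixed point above sigma2 predicts failure.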

### Sparse regression codes for multi-terminal source and channel coding

2012 50th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2012
With minimum-distance encoding/decoding, it is shown that sparse regression codes attain the optimal information-theoretic limits for a variety of multiterminal source and channel coding problems.

### Low-density parity-check codes

A simple but nonoptimum decoding scheme operating directly from the channel a posteriori probabilities is described, and the probability of error using this decoder on a binary symmetric channel is shown to decrease at least exponentially with a root of the block length.

### Fast Sparse Superposition Codes Have Near Exponential Error Probability for $R<{\cal C}$

IEEE Transactions on Information Theory, 2014
A fast decoding algorithm, called the adaptive successive decoder, is developed, and for any rate R less than the capacity C, communication is shown to be reliable with nearly exponentially small error probability.