Two simple stopping criteria for turbo decoding

@article{Shao1999TwoSS,
  title={Two simple stopping criteria for turbo decoding},
  author={Rose Y. Shao and Shu Lin and Marc P. C. Fossorier},
  journal={IEEE Trans. Commun.},
  year={1999},
  volume={47},
  pages={1117-1120}
}
This paper presents two simple and effective criteria for stopping the iteration process in turbo decoding with negligible degradation of the error performance. Both criteria are devised based on the cross-entropy (CE) concept. They are as efficient as the CE criterion but require far fewer and simpler computations.
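The two criteria proposed in the paper are the sign-change-ratio (SCR) and hard-decision-aided (HDA) tests. Below is a minimal Python sketch of how per-iteration checks of this kind might look inside a turbo decoding loop; the array interface, the LLR sign convention, and the threshold q are illustrative assumptions rather than the paper's exact parameterization.

    import numpy as np

    def scr_stop(extrinsic_prev, extrinsic_curr, q=0.01):
        # Sign-change-ratio (SCR) style test: count how many extrinsic LLRs
        # flipped sign between two consecutive iterations and stop once that
        # count falls below a small fraction q of the frame length.
        changes = np.count_nonzero(np.sign(extrinsic_prev) != np.sign(extrinsic_curr))
        return changes <= q * extrinsic_curr.size

    def hda_stop(llr_prev, llr_curr):
        # Hard-decision-aided (HDA) style test: stop when the hard decisions
        # made from the LLRs of two consecutive iterations agree on every bit.
        return np.array_equal(llr_prev >= 0, llr_curr >= 0)

In a typical decoder, such a check would run after each iteration and decoding would terminate early once the chosen test returns True, or otherwise after a fixed maximum number of iterations.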

Citations

Simple stopping criterion for turbo decoding
TLDR
The new improved hard-decision-aided (IHDA) criterion extends the existing hard-decision-aided (HDA) criterion, requires no extra data storage, and achieves similar performance in terms of BER and average number of iterations.
New stopping criteria for turbo decoder
  • Yimin Wei, Tao-li Yang, Yi Yao
  • Computer Science
    2011 Second International Conference on Mechanic Automation and Control Engineering
  • 2011
TLDR
In this new stopping criterion, which uses the coding constraint of the turbo code together with the hard-decision-aided (HDA) algorithm, the frame error rate and the undetected frame error rate can be reduced.
Two Efficient Stopping Criteria for Iterative Decoding
  • Wei Jiang, Daoben Li
  • Computer Science
    2006 First International Conference on Communications and Networking in China
  • 2006
TLDR
The paper proposes two new stopping criteria for iterative decoding of turbo codes, based on the mutual information between the log-likelihood ratio and the data bits, and on the a posteriori error probability of the decoded information bits.
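For the second idea mentioned in the entry above, the a posteriori error probability of a bit with log-likelihood ratio L_k is 1 / (1 + exp(|L_k|)); a rough sketch of a frame-level stop test built on that quantity follows. The averaging rule and the threshold are assumptions for illustration, not the criterion defined in that paper.

    import numpy as np

    def posterior_error_probabilities(llr):
        # A posteriori error probability of each bit given its LLR:
        # P(bit k decided wrongly) = 1 / (1 + exp(|L_k|)).
        return 1.0 / (1.0 + np.exp(np.abs(llr)))

    def error_prob_stop(llr, threshold=1e-4):
        # Illustrative frame-level test: stop when the average estimated
        # bit error probability drops below the (assumed) threshold.
        return posterior_error_probabilities(llr).mean() < threshold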
New error detection techniques and stopping criteria for turbo decoding
  • F. Zhai, I. Fair
  • Computer Science
    2000 Canadian Conference on Electrical and Computer Engineering. Conference Proceedings. Navigating to a New Era (Cat. No.00TH8492)
  • 2000
In this paper we combine techniques for early stopping and error detection in order to stop the iterative turbo decoding process and to detect whether errors are present in the decoded bit sequence.
Simple stopping criterion for min-sum iterative decoding algorithm
TLDR
The proposed stopping criterion is to check whether a decoded sequence is a valid codeword along the encoder trellis structure, which requires less computational complexity and saves memory.
A Novel Efficient Stopping Criterion for Turbo Codes
This paper proposes a novel stopping criterion based on the HDA stopping criterion. To devise the criterion, we consider both the HDA criterion and the mean of the absolute values of the log-likelihood ratios (LLRs) of the decoded bits.
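A rough sketch of the combination described in that excerpt, checking HDA agreement together with the mean of the absolute LLR values as a confidence measure, might look as follows; the way the two conditions are combined and the threshold value are assumptions, not that paper's exact rule.

    import numpy as np

    def hda_mean_llr_stop(llr_prev, llr_curr, mean_abs_llr_threshold=10.0):
        # Stop only when (a) hard decisions agree between consecutive
        # iterations (the plain HDA test) and (b) the mean |LLR| of the
        # current iteration is large, i.e. the decoder is confident.
        hda_agrees = np.array_equal(llr_prev >= 0, llr_curr >= 0)
        confident = np.mean(np.abs(llr_curr)) > mean_abs_llr_threshold
        return hda_agrees and confident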
A New Stopping Criterion for Duo-binary Turbo Codes
TLDR
A novel stopping criterion for duo-binary turbo codes that was able to reduce the average iterative decoding number with a good error performance as compared to conventional criteria is described.
Bit-based SNR insensitive early stopping for turbo decoding
A simple yet effective bit-based early stopping technique that monitors the evolution of the log-likelihood ratios is proposed for turbo decoding. This method is insensitive to the channel signal-to-noise ratio (SNR).
A Low-Complexity Stopping Criterion for Iterative Turbo Decoding
TLDR
An efficient and simple stopping criterion for turbo decoding, derived by observing the behavior of log-likelihood ratio (LLR) values, which achieves a reduced number of iterations while maintaining similar BER/FER performance to the previous criteria.
A new parity-check stopping criterion for turbo decoding
This paper presents a simple and effective stopping criterion for turbo decoding. This criterion is based on a parity-check scheme, which is totally different from the cross-entropy (CE) based criteria.
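The general shape of such a codeword or parity test can be sketched as a syndrome check against a parity-check matrix; an actual turbo decoder would instead re-encode the hard decisions with the constituent encoders, so the matrix H below and the LLR sign convention are purely illustrative assumptions.

    import numpy as np

    def syndrome_stop(llr, H):
        # Hard-decide the LLRs (negative LLR -> bit 1 under the assumed
        # sign convention) and stop when the syndrome H @ c (mod 2) is all
        # zero, i.e. the decisions form a valid codeword of the code
        # described by the parity-check matrix H.
        hard_bits = (llr < 0).astype(np.uint8)
        syndrome = H.dot(hard_bits) % 2
        return not syndrome.any()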

References

Decoding via cross-entropy minimization
  • M. Moher
  • Computer Science
    Proceedings of GLOBECOM '93. IEEE Global Telecommunications Conference
  • 1993
An intuitive algorithm by Lodge et al. [1992] for iterative decoding of block codes is shown to follow from entropy optimization principles. This approach provides a novel and effective decoding algorithm.
Near optimum error correcting coding and decoding: turbo-codes
TLDR
A new family of convolutional codes, nicknamed turbo-codes, is built from a particular concatenation of two recursive systematic codes linked together by nonuniform interleaving; its error performance appears to be close to the theoretical limit predicted by Shannon.
Iterative decoding of binary block and convolutional codes
TLDR
Using log-likelihood algebra, it is shown that any decoder can be used which accepts soft inputs (including a priori values) and delivers soft outputs that can be split into three terms: the soft channel input, the a priori input, and the extrinsic value.
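The three-term split referred to above is the usual decomposition of a soft-in/soft-out decoder's output LLR into channel, a priori, and extrinsic parts, L_out = L_channel + L_apriori + L_extrinsic; a one-line sketch of recovering the extrinsic term (array names assumed) is:

    def extrinsic_from_soft_output(l_out, l_channel, l_apriori):
        # Extrinsic information = total soft output minus the channel and
        # a priori contributions; this is the term exchanged between the
        # two constituent decoders in turbo decoding.
        return l_out - l_channel - l_apriori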