On an Improvement over Rényi's Equivocation Bound

Abstract

We consider the problem of estimating the probability of error in multi-hypothesis testing when the maximum a posteriori (MAP) criterion is used. This probability, also known as the Bayes risk, is an important quantity in many communication and information theory problems. In general, the exact Bayes risk can be difficult to obtain. Many upper and lower bounds are known in the literature. One such upper bound is the equivocation bound due to Rényi, which is of great philosophical interest because it connects the Bayes risk to conditional entropy. Here we give a simple derivation of an improved equivocation bound. We then give some typical examples of problems where these bounds can be of use. We first consider a binary hypothesis testing problem for which the exact Bayes risk is difficult to derive; in such problems, bounds are of interest. Furthermore, using the bounds on the Bayes risk derived in the paper and a random coding argument, we prove a lower bound on the equivocation valid for most random codes over memoryless channels.

I. INTRODUCTION

In his celebrated paper of 1948, Shannon proved the Channel Coding Theorem. This theorem essentially states that, in the limit of very large block lengths, the ensemble of long random block codes (and thus some specific code) achieves an arbitrarily low probability of error under the jointly typical decision rule when used over a given channel at information rates below a limit called the channel's Shannon capacity. It is well known that the decision rule minimizing the Bayes risk is the maximum a posteriori probability (MAP) decision rule. Shannon uses the jointly typical decision rule in his analysis because it is asymptotically optimal and it simplifies the analysis considerably. The strong converse to the channel coding theorem, based on Fano's inequality, states that when the rate is above capacity, the probability of error under any decision rule approaches 1 exponentially as the block length increases.
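
As a concrete illustration of the kind of relationship between the Bayes risk and the equivocation discussed above, the following sketch uses a hypothetical toy example (assumed values, not taken from the paper): a binary source observed through a binary symmetric channel. It computes the exact MAP Bayes risk and the conditional entropy H(X|Y), then checks two classical equivocation bounds, Fano's inequality (a lower bound on the error probability for two hypotheses) and the Hellman-Raviv-type bound P_e <= H(X|Y)/2 in bits (an upper bound). The improved bound derived in the paper is not reproduced here.

```python
import numpy as np

# Toy binary hypothesis testing problem (illustrative values, not from the
# paper): X in {0, 1} with prior P(X=1) = p, observed through a binary
# symmetric channel with crossover probability eps.
p, eps = 0.3, 0.1

P_X = np.array([1 - p, p])
P_Y_given_X = np.array([[1 - eps, eps],
                        [eps, 1 - eps]])
P_XY = P_X[:, None] * P_Y_given_X        # joint distribution, rows X, cols Y
P_Y = P_XY.sum(axis=0)

# Exact Bayes risk of the MAP rule: for each y the MAP decoder keeps the
# largest joint mass, so the error contribution is whatever mass remains.
bayes_risk = np.sum(P_Y - P_XY.max(axis=0))

# Equivocation H(X|Y) in bits (all probabilities are strictly positive here).
H_X_given_Y = -np.sum(P_XY * np.log2(P_XY / P_Y))

# Binary entropy function, used to state Fano's inequality for two hypotheses:
# H(X|Y) <= h_b(P_e), which implicitly lower-bounds P_e.
def h_b(q):
    return 0.0 if q in (0.0, 1.0) else -q * np.log2(q) - (1 - q) * np.log2(1 - q)

print(f"Bayes risk under MAP          P_e    = {bayes_risk:.4f}")
print(f"Equivocation                  H(X|Y) = {H_X_given_Y:.4f} bits")
print(f"Fano (binary):  h_b(P_e) >= H(X|Y):    {h_b(bayes_risk):.4f} >= {H_X_given_Y:.4f}")
print(f"Hellman-Raviv:  P_e <= H(X|Y)/2:       {bayes_risk:.4f} <= {H_X_given_Y / 2:.4f}")
```

With these assumed numbers the MAP risk is 0.10, sitting between the Fano-implied lower bound and the roughly 0.21 upper bound, which illustrates how loose a plain equivocation bound can be and why sharper bounds of the type studied in the paper are of interest.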

References

R. J. McEliece, The Theory of Information and Coding, Encyclopedia of Mathematics and Its Applications, 2002.

G. T. Toussaint, "Feature evaluation criteria and contextual decoding algorithms in statistical pattern recognition," 1972.

W. Feller, An Introduction to Probability Theory and Its Applications, Volume II, 1970.

I. Vajda, "Bounds of the minimal error probability on checking a finite or countable number of hypotheses," Problems of Information Transmission, 1968.

T. Kailath, "The divergence and Bhattacharyya distance in signal selection," 1967.

A. Rényi, "On the amount of missing information and the Neyman–Pearson lemma," 1966.