# Minimum Probability of Error of List M-ary Hypothesis Testing

@article{Kangarshahi2021MinimumPO, title={Minimum Probability of Error of List M-ary Hypothesis Testing}, author={Ehsan Asadi Kangarshahi and Albert Guill{\'e}n i F{\`a}bregas}, journal={ArXiv}, year={2021}, volume={abs/2110.14608} }

We study a variation of Bayesian M-ary hypothesis testing in which the test outputs a list of L candidates out of the M possible hypotheses upon processing the observation. We study the minimum error probability of list hypothesis testing, where an error is defined as the event that the true hypothesis is not in the list output by the test. We derive two exact expressions of the minimum probability of error. The first is expressed as the error probability of a certain non-Bayesian binary hypothesis test…
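A standard fact about this setting (not a reconstruction of the paper's expressions) is that the Bayes-optimal list test outputs, for each observation, the L hypotheses with the largest posterior probability, and the minimum error probability is one minus the expected mass of those top-L posteriors. A minimal numerical sketch, with an illustrative random prior and likelihoods:

```python
import numpy as np

# Toy setup (all distributions are illustrative, not from the paper):
# M hypotheses with prior pi(m), finite observation alphabet of size n_obs.
rng = np.random.default_rng(0)
M, L, n_obs = 5, 2, 4  # number of hypotheses, list size, observation alphabet

prior = rng.dirichlet(np.ones(M))              # pi(m), shape (M,)
likelihood = rng.dirichlet(np.ones(n_obs), M)  # P(y|m), shape (M, n_obs)

# Joint pi(m) P(y|m), marginal P(y), and posterior P(m|y).
joint = prior[:, None] * likelihood
p_y = joint.sum(axis=0)
posterior = joint / p_y

# Optimal list test: for each y, keep the L hypotheses with the largest
# posterior; the minimum list-error probability is
#   1 - E_y[ sum of the L largest posteriors ].
top_L_mass = np.sort(posterior, axis=0)[-L:, :].sum(axis=0)
min_error = 1.0 - (p_y * top_L_mass).sum()
print(min_error)
```

With L = 1 this reduces to the usual MAP test, and with L = M the list always contains the truth and the error probability is zero.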

## References


Bayesian $M$ -Ary Hypothesis Testing: The Meta-Converse and Verdú-Han Bounds Are Tight

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2016

Two alternative exact characterizations of the minimum error probability of Bayesian M-ary hypothesis testing are derived and help to identify the steps where existing converse bounds are loose.

On the Problem of the Most Efficient Tests of Statistical Hypotheses

- Mathematics
- 1933

The problem of testing statistical hypotheses is an old one. Its origins are usually connected with the name of Thomas Bayes, who gave the well-known theorem on the probabilities a posteriori of the…

List decoding for noisy channels

- Computer Science
- 1957

This paper investigates upper and lower bounds on the error probability of a modified decoding procedure in which the receiver lists L messages, rather than one, after reception. The bounds imply that, for large L, the average over all codes is almost as good as the best code.

An Introductory Guide to Fano's Inequality with Applications in Statistical Estimation

- Computer Science, Mathematics
- ArXiv
- 2019

This chapter provides a survey of Fano's inequality and its variants in the context of statistical estimation, adopting a versatile framework that covers a wide range of specific problems.

Testing Statistical Hypotheses

- Computer Science
- 2005

This classic textbook, now available from Springer, summarizes developments in the field of hypotheses testing. Optimality considerations continue to provide the organizing principle. However, they…

Soft–Input Soft–Output Single Tree-Search Sphere Decoding

- Computer Science, Mathematics
- IEEE Transactions on Information Theory
- 2010

A low-complexity SISO sphere-decoding algorithm based on the single tree-search paradigm, proposed originally for soft-output MIMO detection by Studer et al., is presented; it yields significant complexity savings and covers a large performance/complexity tradeoff region by adjusting a single parameter.

Channel Coding Rate in the Finite Blocklength Regime

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2010

It is shown analytically that the maximal rate achievable with error probability $\epsilon$ is closely approximated by $C - \sqrt{V/n}\,Q^{-1}(\epsilon)$, where $C$ is the capacity, $V$ is a characteristic of the channel referred to as channel dispersion, and $Q$ is the complementary Gaussian cumulative distribution function.
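The normal approximation above is easy to evaluate in closed form for simple channels. A minimal sketch for a binary symmetric channel with crossover probability $p$, for which $C = 1 - h(p)$ and $V = p(1-p)\log_2^2\frac{1-p}{p}$ are standard expressions; the specific values of p, n, and eps below are illustrative:

```python
import math
from statistics import NormalDist


def bsc_normal_approx(p: float, n: int, eps: float) -> float:
    """Normal-approximation rate C - sqrt(V/n) * Q^{-1}(eps), in bits/use,
    for a BSC with crossover probability p at blocklength n."""
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # binary entropy h(p)
    capacity = 1.0 - h                                  # C = 1 - h(p)
    dispersion = p * (1 - p) * math.log2((1 - p) / p) ** 2  # V
    # Q^{-1}(eps) = Phi^{-1}(1 - eps), via the standard normal inverse CDF
    q_inv = NormalDist().inv_cdf(1.0 - eps)
    return capacity - math.sqrt(dispersion / n) * q_inv


# Illustrative values: BSC(0.11), blocklength 1000, target error 1e-3.
print(bsc_normal_approx(0.11, 1000, 1e-3))
```

As the blocklength n grows, the backoff term $\sqrt{V/n}\,Q^{-1}(\epsilon)$ vanishes and the approximation approaches capacity from below.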

Information-Spectrum Methods in Information Theory

- Computer Science
- 2002

This monograph develops information-spectrum methods for source and channel coding, including multi-terminal information theory.

Information Theory

- Nature
- 1962

Information Theory. Papers read at a Symposium on Information Theory held at the Royal Institution, London, August 29th to September 2nd, 1960. Edited by Colin Cherry. Pp. xi + 476. (London:…

Information Theory (draft)

- 2021