# Chernoff information of exponential families

```bibtex
@article{Nielsen2011ChernoffIO,
  title   = {Chernoff information of exponential families},
  author  = {Frank Nielsen},
  journal = {ArXiv},
  year    = {2011},
  volume  = {abs/1102.2684}
}
```

Chernoff information upper bounds the probability of error of the optimal Bayesian decision rule for $2$-class classification problems. In practice, however, the Chernoff bound turns out to be hard to calculate or even to approximate. In statistics, many common distributions, such as Gaussians, Poisson distributions, or frequency histograms called multinomials, can be handled in the unified framework of exponential families. In this note, we prove that the Chernoff information for members of the same…
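For members of the same exponential family, the Chernoff information can be expressed as a maximal skew Jensen divergence of the log-normalizer over the natural parameters. The following is a minimal numerical sketch of that computation for univariate Gaussians; the helper names and the ternary search are my own choices, not the paper's code.

```python
import math

def gaussian_natural(mu, sigma2):
    # Natural parameters of the univariate Gaussian N(mu, sigma2)
    # (helper names are mine, not the paper's)
    return (mu / sigma2, -1.0 / (2.0 * sigma2))

def F(theta):
    # Log-normalizer of the univariate Gaussian exponential family
    t1, t2 = theta
    return -t1 * t1 / (4.0 * t2) + 0.5 * math.log(-math.pi / t2)

def skew_jensen(alpha, th1, th2):
    # J_alpha = alpha F(th1) + (1-alpha) F(th2) - F(alpha th1 + (1-alpha) th2)
    mid = tuple(alpha * a + (1.0 - alpha) * b for a, b in zip(th1, th2))
    return alpha * F(th1) + (1.0 - alpha) * F(th2) - F(mid)

def chernoff_information(th1, th2, iters=200):
    # Maximize J_alpha over alpha in (0,1) by ternary search;
    # valid because J_alpha is concave in alpha (F is convex).
    lo, hi = 1e-9, 1.0 - 1e-9
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if skew_jensen(m1, th1, th2) < skew_jensen(m2, th1, th2):
            lo = m1
        else:
            hi = m2
    a = 0.5 * (lo + hi)
    return skew_jensen(a, th1, th2), a

# Equal-variance sanity check: C(N(0,1), N(2,1)) = (mu1-mu2)^2/(8 sigma^2) = 0.5, alpha* = 1/2
c, a = chernoff_information(gaussian_natural(0.0, 1.0), gaussian_natural(2.0, 1.0))
```

The closed form for equal-variance Gaussians serves only as a sanity check; the ternary search is one simple way to locate the unique optimal α.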

## 51 Citations

Estimating Mixture Entropy with Pairwise Distances

- Computer Science · Entropy
- 2017

A family of estimators based on pairwise distances between mixture components is proposed, and it is proved that this estimator class has many attractive properties and is very useful in optimization problems involving maximization/minimization of entropy and mutual information, such as MaxEnt and rate-distortion problems.
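One member of such an estimator family can be sketched for a univariate Gaussian mixture, using the closed-form pairwise Bhattacharyya distance as the distance function. This is a hedged illustration with my own naming, not the authors' code:

```python
import math

def gaussian_entropy(s2):
    # Differential entropy of N(mu, s2)
    return 0.5 * math.log(2.0 * math.pi * math.e * s2)

def bhattacharyya_gauss(m1, s1, m2, s2):
    # Closed-form Bhattacharyya distance between univariate Gaussians
    v = s1 + s2
    return 0.25 * (m1 - m2) ** 2 / v + 0.5 * math.log(v / (2.0 * math.sqrt(s1 * s2)))

def pairwise_entropy_estimate(weights, params):
    # H_hat = sum_i w_i H(p_i) - sum_i w_i log sum_j w_j exp(-D(p_i, p_j)),
    # with params a list of (mean, variance) pairs (naming is mine)
    cond = sum(w * gaussian_entropy(s2) for w, (_, s2) in zip(weights, params))
    inter = 0.0
    for wi, (mi, si) in zip(weights, params):
        z = sum(wj * math.exp(-bhattacharyya_gauss(mi, si, mj, sj))
                for wj, (mj, sj) in zip(weights, params))
        inter += wi * math.log(z)
    return cond - inter

# Identical components collapse to a single Gaussian's entropy;
# well-separated components add the weight entropy log(2)
est_same = pairwise_entropy_estimate([0.5, 0.5], [(0.0, 1.0), (0.0, 1.0)])
est_far = pairwise_entropy_estimate([0.5, 0.5], [(0.0, 1.0), (100.0, 1.0)])
```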

Thermodynamic assessment of probability distribution divergencies and Bayesian model comparison

- Computer Science
- 2013

Within the path-sampling framework, it is shown that probability distribution divergences, such as the Chernoff information, can be estimated via thermodynamic integration, and that a geometric approach is feasible, which prompts intuition and facilitates tuning the error sources.

Information geometry metric for random signal detection in large random sensing systems

- Computer Science, Mathematics · 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2017

This work proposes a closed-form expression for the asymptotic normalized s-divergence, providing an analytic expression for the optimal value of s and thereby for the minimal Bayes' error probability of detection.

A Family of Bounded Divergence Measures Based on The Bhattacharyya Coefficient

- Mathematics, Computer Science
- 2012

It is shown that the BBD belongs to the class of generalized Csiszár f-divergences; properties such as its curvature and its relation to Fisher information are derived, and inequalities between the BBD and well-known measures such as the Hellinger and Jensen-Shannon divergences are established.

Information geometric probability models in statistical signal processing

- Computer Science
- 2016

It is shown that finding the extended Chernoff point can be guided by considering the orientation of the component PDFs, and that the use of this paradigm can lead to better ways to combine estimates in classical problems such as combining estimates of common means from separate Normal populations.

Some Results on Generalized Ellipsoid Intersection Fusion

- Mathematics · 2019 22nd International Conference on Information Fusion (FUSION)
- 2019

Numerical examples demonstrate that the generalized ellipsoid intersection method has lower Shannon information and higher Chernoff information than the generalized covariance intersection method.

Log-linear Chernoff Fusion for Distributed Particle Filtering

- Computer Science · 2019 22nd International Conference on Information Fusion (FUSION)
- 2019

This work provides an account of known techniques for tuning the fusion parameters and suggests a new one based on optimizing the Chernoff information, which shows a decisive advantage for the new technique in terms of both estimation accuracy and computational load.
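Log-linear (Chernoff) fusion forms the fused density as a normalized geometric mean $p_1^{\omega} p_2^{1-\omega}$; for Gaussians the result is again Gaussian. A scalar sketch under that assumption (the function name is mine, and the step of choosing ω by optimizing Chernoff information is omitted):

```python
def chernoff_fuse_1d(m1, p1, m2, p2, w):
    # Log-linear fusion of two scalar Gaussians N(m1, p1) and N(m2, p2):
    # the fused density is proportional to N1^w * N2^(1-w), again Gaussian.
    info = w / p1 + (1.0 - w) / p2            # fused information (inverse variance)
    pf = 1.0 / info
    mf = pf * (w * m1 / p1 + (1.0 - w) * m2 / p2)
    return mf, pf

# With w = 1 the fusion returns the first Gaussian unchanged;
# with equal variances and w = 0.5 the fused mean is the midpoint
mf, pf = chernoff_fuse_1d(0.0, 2.0, 2.0, 2.0, 0.5)
```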

On w-mixtures: Finite convex combinations of prescribed component distributions

- Computer Science · ArXiv
- 2017

It is shown how the Kullback-Leibler (KL) divergence can be recovered from the corresponding Bregman divergence for the negentropy generator, and it is proved that the statistical skew Jensen-Shannon divergence between $w$-mixtures is equivalent to a skew Jensen divergence between their corresponding parameters.

Approaches to Chernoff fusion with applications to distributed estimation

- Computer Science · Digit. Signal Process.
- 2020

## References

Showing 1-10 of 23 references

Probabilistic distance measures of the Dirichlet and Beta distributions

- Mathematics · Pattern Recognit.
- 2008

Statistical exponential families: A digest with flash cards

- Computer Science · ArXiv
- 2009

This document concisely describes the ubiquitous class of exponential family distributions met in statistics and recalls the Fisher-Rao Riemannian geometry and the dual affine-connection information geometry of statistical manifolds.

Nearest neighbor pattern classification

- Computer Science, Mathematics · IEEE Trans. Inf. Theory
- 1967

The nearest neighbor decision rule assigns to an unclassified sample point the classification of the nearest of a set of previously classified points; its probability of error is at most twice the Bayes risk, so it may be said that half the classification information in an infinite sample set is contained in the nearest neighbor.
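The rule itself fits in a few lines; a toy sketch (names and data are mine):

```python
def nearest_neighbor_classify(train, query):
    # 1-NN rule: return the label of the closest labeled point
    # under squared Euclidean distance
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min((d2(x, query), y) for x, y in train)
    return label

# Hypothetical two-point training set
train = [((0.0, 0.0), "A"), ((1.0, 1.0), "B")]
```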

The Burbea-Rao and Bhattacharyya Centroids

- Computer Science · IEEE Transactions on Information Theory
- 2011

An efficient algorithm is presented for computing the Bhattacharyya centroid of a set of parametric distributions belonging to the same exponential family, improving over former specialized methods in the literature that were limited to univariate or "diagonal" multivariate Gaussians.

Arbitrarily Tight Upper and Lower Bounds on the Bayesian Probability of Error

- Computer Science, Mathematics · IEEE Trans. Pattern Anal. Mach. Intell.
- 1996

New upper and lower bounds on the minimum probability of error of Bayesian decision systems for the two-class problem are presented; the bounds can be made arbitrarily tight, making them tighter than any previously known bounds.

The Divergence and Bhattacharyya Distance Measures in Signal Selection

- Computer Science
- 1967

This partly tutorial paper compares the properties of an often used measure, the divergence, with a new measure that is often easier to evaluate, called the Bhattacharyya distance, which gives results that are at least as good and often better than those given by the divergence.

Bregman Voronoi Diagrams

- Computer Science · SODA '07
- 2007

A framework for defining and building Voronoi diagrams for a broad class of distance functions called Bregman divergences, which allow one to define information-theoretic Voronoi diagrams in statistical parametric spaces based on the relative entropy of distributions.
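A minimal illustration of a Bregman Voronoi cell query (my own sketch, using the discrete KL divergence as the Bregman divergence, with the query point in the first argument; the paper distinguishes diagrams by argument orientation):

```python
import math

def kl(p, q):
    # Discrete Kullback-Leibler divergence: the Bregman divergence
    # generated by the negative Shannon entropy
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def bregman_voronoi_cell(sites, x):
    # Index of the site minimizing D(x || site):
    # the Bregman Voronoi cell containing x
    return min(range(len(sites)), key=lambda i: kl(x, sites[i]))

# Two hypothetical sites on the probability simplex
sites = [(0.9, 0.1), (0.1, 0.9)]
```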

Symmetrizing the Kullback-Leibler Distance

- Computer Science
- 2001

A new distance measure, the resistor-average distance, between two probability distributions is defined; it is closely related to the Kullback-Leibler distance, and its relation to well-known distance measures is determined.
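The resistor-average distance combines the two asymmetric KL directions the way parallel resistors combine, $1/R = 1/\mathrm{KL}(p\|q) + 1/\mathrm{KL}(q\|p)$. A small sketch for discrete distributions (function names mine):

```python
import math

def kl(p, q):
    # Discrete Kullback-Leibler divergence
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def resistor_average(p, q):
    # 1/R = 1/KL(p||q) + 1/KL(q||p), i.e. R = D1*D2 / (D1 + D2)
    d1, d2 = kl(p, q), kl(q, p)
    return (d1 * d2) / (d1 + d2)

r = resistor_average((0.5, 0.5), (0.9, 0.1))
```

By construction the result is always smaller than either KL direction, like a parallel resistance.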

Families of Alpha- Beta- and Gamma- Divergences: Flexible and Robust Measures of Similarities

- Computer Science · Entropy
- 2010

It is shown that a new wide class of Gamma-divergences can be generated not only from the family of Beta-divergences but also from a family of Alpha-divergences.

Statistical Edge Detection: Learning and Evaluating Edge Cues

- Computer Science · IEEE Trans. Pattern Anal. Mach. Intell.
- 2003

This work uses presegmented images to learn the probability distributions of filter responses conditioned on whether they are evaluated on or off an edge, and evaluates the effectiveness of different visual cues using the Chernoff information and Receiver Operator Characteristic (ROC) curves.