• Corpus ID: 14543652

Chernoff information of exponential families

@article{Nielsen2011ChernoffIO,
  title={Chernoff information of exponential families},
  author={Frank Nielsen},
  journal={ArXiv},
  year={2011},
  volume={abs/1102.2684}
}
  • F. Nielsen
  • Published 14 February 2011
  • Computer Science
  • ArXiv
Chernoff information upper bounds the probability of error of the optimal Bayesian decision rule for $2$-class classification problems. However, it turns out that in practice the Chernoff bound is hard to calculate or even approximate. In statistics, many usual distributions, such as Gaussians, Poissons or frequency histograms called multinomials, can be handled in the unified framework of exponential families. In this note, we prove that the Chernoff information for members of the same… 
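For context, the Chernoff information between densities $p$ and $q$ is $C(p,q) = -\min_{\alpha \in (0,1)} \log \int p^\alpha(x)\, q^{1-\alpha}(x)\, d\mu(x)$. When $p = p_{\theta_1}$ and $q = p_{\theta_2}$ belong to the same exponential family with log-normalizer $F$, one has $-\log \int p_{\theta_1}^\alpha p_{\theta_2}^{1-\alpha}\, d\mu = \alpha F(\theta_1) + (1-\alpha) F(\theta_2) - F(\alpha \theta_1 + (1-\alpha)\theta_2)$, a skew Jensen divergence of $F$, so computing $C$ reduces to a one-dimensional concave maximization over $\alpha$. The following Python sketch illustrates this route for univariate Gaussians; the helper names (natural_params, log_normalizer, skew_jensen, chernoff_information) are illustrative choices, not code from the paper.

import numpy as np
from scipy.optimize import minimize_scalar

def natural_params(mu, sigma2):
    # Univariate Gaussian as an exponential family with sufficient statistic (x, x^2).
    return np.array([mu / sigma2, -1.0 / (2.0 * sigma2)])

def log_normalizer(theta):
    # F(theta) = -theta1^2 / (4 theta2) + 0.5 * log(-pi / theta2) for the Gaussian family.
    t1, t2 = theta
    return -t1 * t1 / (4.0 * t2) + 0.5 * np.log(-np.pi / t2)

def skew_jensen(theta_p, theta_q, alpha):
    # alpha F(theta_p) + (1 - alpha) F(theta_q) - F(alpha theta_p + (1 - alpha) theta_q)
    # equals -log of the integral of p^alpha q^(1-alpha) for members of the same family.
    return (alpha * log_normalizer(theta_p)
            + (1.0 - alpha) * log_normalizer(theta_q)
            - log_normalizer(alpha * theta_p + (1.0 - alpha) * theta_q))

def chernoff_information(mu1, s1, mu2, s2):
    # Maximize the skew Jensen divergence over alpha in (0, 1); it is concave in alpha.
    tp, tq = natural_params(mu1, s1), natural_params(mu2, s2)
    res = minimize_scalar(lambda a: -skew_jensen(tp, tq, a),
                          bounds=(1e-6, 1.0 - 1e-6), method="bounded")
    return -res.fun, res.x  # (Chernoff information, optimal alpha)

print(chernoff_information(0.0, 1.0, 2.0, 1.5))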

Figures from this paper

Citations

Estimating Mixture Entropy with Pairwise Distances
TLDR
A family of estimators based on a pairwise distance function between mixture components is proposed, and this estimator class is proved to have many attractive properties and to be useful in optimization problems involving maximization or minimization of entropy and mutual information, such as MaxEnt and rate-distortion problems.
Thermodynamic assessment of probability distribution divergencies and Bayesian model comparison
TLDR
Within the path sampling framework, it is shown that probability distribution divergences, such as the Chernoff information, can be estimated via thermodynamic integration; a geometric approach is also feasible, which aids intuition and facilitates tuning the error sources.
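For reference, the path-sampling (thermodynamic integration) identity behind this approach writes a log-ratio of normalizing constants as a one-dimensional integral: with $p_\lambda(x) = q_\lambda(x)/Z(\lambda)$ along a path $\lambda \in [0,1]$, $\log \frac{Z(1)}{Z(0)} = \int_0^1 \mathbb{E}_{p_\lambda}\!\left[\tfrac{\partial}{\partial\lambda} \log q_\lambda(x)\right] d\lambda$, estimated by Monte Carlo over a grid of $\lambda$ values. This is the standard identity, recalled here as background rather than the cited paper's exact formulation.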
Information geometry metric for random signal detection in large random sensing systems
  • R. Boyer, F. Nielsen
  • Computer Science, Mathematics
    2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2017
TLDR
This work proposes a closed-form expression for the asymptotic normalized s-divergence, which yields an analytic expression for the optimal value of s minimizing the Bayes' error probability in the detection problem.
A Family of Bounded Divergence Measures Based on The Bhattacharyya Coefficient
TLDR
It is shown that BBD belongs to the class of generalized Csiszár f-divergences; some properties, such as curvature and the relation to Fisher information, are derived, along with certain inequalities between BBD and well-known measures such as the Hellinger and Jensen-Shannon divergences.
Information geometric probability models in statistical signal processing
TLDR
It is shown that finding the extended Chernoff point can be guided by considering the orientation of the component PDFs, and that the use of this paradigm can lead to better ways to combine estimates in classical problems such as combining estimates of common means from separate Normal populations.
Some Results on Generalized Ellipsoid Intersection Fusion
TLDR
Numerical examples demonstrate that the generalized ellipsoid intersection method has lower Shannon information and higher Chernoff information than the generalized covariance intersection method.
Log-linear Chernoff Fusion for Distributed Particle Filtering
TLDR
This work provides an account of known techniques for tuning the fusion parameters and suggests a new one based on the optimization of Chernoff information, which shows a decisive advantage in terms of both estimation accuracy and computational load.
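As background, log-linear (Chernoff) fusion combines two posteriors through a weighted geometric mean, $p_{\text{fused}}(x) \propto p_1(x)^{w}\, p_2(x)^{1-w}$ with $w \in [0,1]$; the tuning question above is the choice of $w$, here proposed via Chernoff information. This states the generic fusion rule only, not necessarily the authors' exact notation.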
On w-mixtures: Finite convex combinations of prescribed component distributions
TLDR
It is shown how the Kullback-Leibler (KL) divergence can be recovered from the corresponding Bregman divergence for the negentropy generator, and it is proved that the statistical skew Jensen-Shannon divergence between $w$-mixtures is equivalent to a skew Jensen divergence between their corresponding parameters.
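For orientation, a $w$-mixture fixes the component densities $p_1,\dots,p_k$ and varies only the weight vector $w$; the negentropy $F(w) = -h\!\left(\sum_i w_i p_i\right)$ is then a convex function of $w$, and the recovered identity reads $\mathrm{KL}(m_w : m_{w'}) = B_F(w : w')$, the Bregman divergence induced by this generator (stated here from the abstract above, in my own notation).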
Bernoulli: Official Journal of the Bernoulli Society for Mathematical Statistics and Probability, Volume 28, Number 2, May 2022
A list of forthcoming papers can be found online at http://www.bernoullisociety.org/index.php/publications/bernoulli-journal/bernoulli-journal-papers

References

SHOWING 1-10 OF 23 REFERENCES
Probabilistic distance measures of the Dirichlet and Beta distributions
Statistical exponential families: A digest with flash cards
TLDR
This document concisely describes the ubiquitous class of exponential family distributions encountered in statistics and recalls the Fisher-Rao Riemannian geometry and the dual affine connection information geometry of statistical manifolds.
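As a reminder of the shared framework, an exponential family has densities of the form $p(x;\theta) = \exp\!\left(\langle \theta, t(x)\rangle - F(\theta) + k(x)\right)$, where $t(x)$ is the sufficient statistic, $k(x)$ a carrier term, and $F$ the convex log-normalizer; Gaussians, Poissons and multinomials all fit this form for suitable $(t, k, F)$.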
Nearest neighbor pattern classification
TLDR
The nearest neighbor decision rule assigns to an unclassified sample point the classification of the nearest of a set of previously classified points; in this sense, it may be said that half the classification information in an infinite sample set is contained in the nearest neighbor.
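The classical result alluded to is the Cover-Hart bound: asymptotically, the nearest-neighbor risk $R_{\mathrm{NN}}$ satisfies $R^* \le R_{\mathrm{NN}} \le 2R^*(1 - R^*)$ in the two-class case, where $R^*$ is the Bayes risk, hence the "half the classification information" reading.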
The Burbea-Rao and Bhattacharyya Centroids
TLDR
An efficient algorithm is presented for computing the Bhattacharyya centroid of a set of parametric distributions belonging to the same exponential family, improving over former specialized methods in the literature that were limited to univariate or "diagonal" multivariate Gaussians.
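For members of a common exponential family with log-normalizer $F$, the Bhattacharyya distance reduces to a Jensen (Burbea-Rao) divergence on the natural parameters, $B(p_{\theta_1}, p_{\theta_2}) = \frac{F(\theta_1)+F(\theta_2)}{2} - F\!\left(\frac{\theta_1+\theta_2}{2}\right)$, which is what makes a generic centroid algorithm possible; this is the $\alpha = 1/2$ case of the skew Jensen divergence sketched after the abstract above.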
Arbitrarily Tight Upper and Lower Bounds on the Bayesian Probability of Error
TLDR
New upper and lower bounds on the minimum probability of error of Bayesian decision systems for the two-class problem are presented that are tighter than any previously known bounds.
The Divergence and Bhattacharyya Distance Measures in Signal Selection
TLDR
This partly tutorial paper compares the properties of an often used measure, the divergence, with a new measure that is often easier to evaluate, called the Bhattacharyya distance, which gives results that are at least as good and often better than those given by the divergence.
Bregman Voronoi Diagrams
TLDR
A framework is presented for defining and building Voronoi diagrams for a broad class of distance functions called Bregman divergences, which allows one to define information-theoretic Voronoi diagrams in statistical parametric spaces based on the relative entropy of distributions.
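For completeness, the Bregman divergence generated by a strictly convex, differentiable $F$ is $B_F(x : y) = F(x) - F(y) - \langle x - y, \nabla F(y)\rangle$; taking $F$ to be the Shannon negentropy on the simplex recovers the relative entropy mentioned above.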
Symmetrizing the Kullback-Leibler Distance
TLDR
A new distance measure, the resistor-average distance, between two probability distributions is defined that is closely related to the Kullback-Leibler distance, and its relation to well-known distance measures is determined.
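The resistor-average distance $R(p,q)$ is defined through the harmonic-mean-like combination $\frac{1}{R(p,q)} = \frac{1}{\mathrm{KL}(p\,\|\,q)} + \frac{1}{\mathrm{KL}(q\,\|\,p)}$, by analogy with resistors in parallel; this definition is recalled here from the cited paper.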
Families of Alpha- Beta- and Gamma- Divergences: Flexible and Robust Measures of Similarities
TLDR
It is shown that a new wide class of Gamma-divergences can be generated not only from the family of Beta-divergences but also from a family of Alpha-divergences.
Statistical Edge Detection: Learning and Evaluating Edge Cues
TLDR
This work uses presegmented images to learn the probability distributions of filter responses conditioned on whether they are evaluated on or off an edge, and evaluates the effectiveness of different visual cues using the Chernoff information and Receiver Operating Characteristic (ROC) curves.