Divergences and Risks for Multiclass Experiments

@inproceedings{GarcaGarca2012DivergencesAR,
  title={Divergences and Risks for Multiclass Experiments},
  author={Dar{\'i}o Garc{\'i}a-Garc{\'i}a and Robert C. Williamson},
  booktitle={COLT},
  year={2012}
}

Csiszár’s f-divergence is a way to measure the similarity of two probability distributions. We study the extension of f-divergence to more than two distributions in order to measure their joint similarity. By exploiting classical results from the comparison-of-experiments literature, we prove that the resulting divergence satisfies all the same properties as the traditional binary one. Considering the multidistribution case actually makes the proofs simpler. The key to these results is a formal bridge…
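
For concreteness, here is a minimal sketch (in Python; not from the paper) of the classical binary f-divergence that the abstract takes as its starting point. The paper's multidistribution extension is not reproduced here; the example distributions, function names, and the two choices of f below are illustrative assumptions.

```python
import numpy as np

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(P || Q) = sum_x q(x) * f(p(x) / q(x)),
    where f is convex with f(1) = 0.

    p, q: 1-D arrays of probabilities over the same finite alphabet.
    Assumes strictly positive entries; zero probabilities are not handled.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# Two standard instances of f:
kl = lambda t: t * np.log(t)          # Kullback-Leibler divergence
tv = lambda t: 0.5 * np.abs(t - 1.0)  # total variation distance

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(f_divergence(p, q, kl))  # KL(P || Q), approx. 0.025
print(f_divergence(p, q, tv))  # TV(P, Q) = 0.1
```

Since f(1) = 0, the divergence vanishes when P = Q, and convexity of f gives nonnegativity via Jensen's inequality.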
This paper has 18 citations.
