On Entropies, Divergences, and Mean Values

  title={On Entropies, Divergences, and Mean Values},
  author={Michèle Basseville and Jean-François Cardoso}

  • Michèle Basseville, Jean-François Cardoso
Two entropy-based divergence classes are compared using the associated quadratic differential metrics, mean values and projections. I. Two classes of divergences. The design concepts of divergences are of interest because of the key role they play in statistical inference and signal processing. Most of the existing divergences D between two probability distributions may be associated with an integral or non-integral entropy functional H with respect to some reference measure. We distinguish…
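The two constructions the abstract contrasts can be illustrated concretely: Csiszár-type f-divergences built from an integral of a convex function of the likelihood ratio, and Bregman distances generated by a convex entropy functional H. The sketch below (function names and example distributions are my own, not from the paper) shows that with f(t) = t log t and H taken as negative Shannon entropy, both constructions reduce to the Kullback-Leibler divergence:

```python
import math

def csiszar_divergence(p, q, f):
    """Csiszar f-divergence: D_f(p||q) = sum_i q_i * f(p_i / q_i)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

def bregman_divergence(p, q, h, grad_h):
    """Bregman distance generated by a convex functional H:
    D_H(p, q) = H(p) - H(q) - <grad H(q), p - q>."""
    linear = sum(g * (pi - qi) for g, pi, qi in zip(grad_h(q), p, q))
    return h(p) - h(q) - linear

def neg_entropy(p):
    # Negative Shannon entropy H(p) = sum_i p_i log p_i (convex).
    return sum(pi * math.log(pi) for pi in p)

def grad_neg_entropy(p):
    # Gradient of H: dH/dp_i = log p_i + 1.
    return [math.log(pi) + 1.0 for pi in p]

def f_kl(t):
    # Generator f(t) = t log t of the KL divergence.
    return t * math.log(t)

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]

d_f = csiszar_divergence(p, q, f_kl)
d_b = bregman_divergence(p, q, neg_entropy, grad_neg_entropy)
# Both agree with the Kullback-Leibler divergence KL(p||q),
# since sum(p) = sum(q) = 1 makes the extra Bregman terms cancel.
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
print(d_f, d_b, kl)
```

For generators or functionals other than the KL pair, the two classes genuinely differ; the paper's comparison of their metrics, mean values and projections concerns that general case.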
Highly Cited
This paper has 22 citations.




Publications referenced by this paper.
Showing 1-10 of 10 references

Generalized cutoff rates and Rényi's information measures

  • I. Csiszár
  • IEEE Trans. Information Theory
  • 1995

Differential-Geometrical Methods in Statistics

  • S.-I. Amari
  • Lecture Notes in Statistics
  • 1985

On Measures of Information and Their Characterizations

  • J. Aczél and Z. Daróczy
  • 1975

Information measures: a critical survey

  • I. Csiszár
  • Trans. 7th Prague Conf. on Inf. Th., Stat. Dec. Funct. and Rand. Proc.
  • 1974

Why least-squares and maximum entropy? An axiomatic approach to inference for linear inverse problems

  • I. Csiszár
  • Annals…

Information radius

  • R. Sibson
  • Z. Wahrscheinlichkeitsth. verw. Gebiete
  • 1969

Differential metrics in probability spaces

  • C. R. Rao
  • IMS Lecture Notes

On measures of entropy and information

  • A. Rényi
  • 1961

Generalized projections for non-negative functions

  • I. Csiszár
  • Acta Mathematica Hungarica

Inequalities

  • G. H. Hardy, J. E. Littlewood and G. Pólya
  • 1952