Novel tight classification error bounds under mismatch conditions based on f-Divergence

@inproceedings{Schlter2013NovelTC,
  title={Novel tight classification error bounds under mismatch conditions based on f-Divergence},
  author={Ralf Schl{\"u}ter and Markus Nu{\ss}baum-Thom and Eugen Beck and Tamer Alkhouli and Hermann Ney},
  booktitle={2013 IEEE Information Theory Workshop (ITW)},
  year={2013},
  pages={1--5}
}
By default, statistical classification/multiple hypothesis testing is faced with the model mismatch introduced by replacing the true distributions in the Bayes decision rule with model distributions estimated on training samples. Although a large number of statistical measures exist w.r.t. the mismatch introduced, these works rarely relate it to the mismatch in accuracy, i.e., the difference between model error and Bayes error. In this work, the accuracy mismatch between the ideal Bayes decision rule…
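The abstract contrasts the Bayes error (deciding with the true posteriors) with the model error (deciding with estimated posteriors) and relates their gap to f-divergences between the two distributions. A minimal numerical sketch of these quantities on a toy discrete problem — the numbers and the total-variation choice of f-divergence are illustrative assumptions, not the paper's construction:

```python
import numpy as np

# Toy setup (hypothetical numbers): 3 observations, 2 classes.
px = np.array([0.2, 0.3, 0.5])          # true input marginal p(x)
p_post = np.array([[0.9, 0.1],          # true class posteriors p(c|x)
                   [0.4, 0.6],
                   [0.7, 0.3]])
q_post = np.array([[0.6, 0.4],          # mismatched model posteriors q(c|x)
                   [0.55, 0.45],
                   [0.2, 0.8]])

# Bayes error: decide with the true posterior, err with prob 1 - max_c p(c|x).
bayes_error = np.sum(px * (1.0 - p_post.max(axis=1)))

# Model error: decide with argmax of q, but errors are counted under p.
model_decisions = q_post.argmax(axis=1)
model_error = np.sum(px * (1.0 - p_post[np.arange(3), model_decisions]))

# One example f-divergence: total variation, f(t) = |t - 1| / 2,
# averaged over the input distribution.
tv = np.sum(px * 0.5 * np.abs(p_post - q_post).sum(axis=1))

# The accuracy mismatch (model error - Bayes error) is nonnegative;
# bounds of the kind the paper studies control it via divergences like tv.
print(bayes_error, model_error, tv)
```

In this instance the mismatch costs 0.26 in error probability, and the classical total-variation bound (excess error at most twice the averaged total variation) holds with room to spare.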


