f-Entropies, Probability of Error, and Feature Selection

@article{BenBassat1978fEntropiesPO,
  title={f-Entropies, Probability of Error, and Feature Selection},
  author={Moshe Ben-Bassat},
  journal={Information and Control},
  year={1978},
  volume={39},
  pages={227-242}
}
The f-entropy family of information measures, $u(p_1, \ldots, p_m) = \sum_{k} f(p_k)$ with $f$ concave (e.g., Shannon (1948), Bell Syst. Tech. J. 27, 379-423, 623-656; quadratic; Daróczy (1970), Inform. Contr. 16, 36-51; etc.), is considered. A characterization of the tightest upper and lower bounds on f-entropies by means of the probability of error is presented. These bounds are used to derive the dual bounds, i.e., the tightest lower and upper bounds on the probability of error by means of f-entropies. Concerning…
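As a concrete illustration (not from the paper itself), the sketch below evaluates two members of the f-entropy family for a discrete distribution, Shannon's f(p) = -p log p and the quadratic f(p) = p(1 - p), alongside the Bayes probability of error 1 - max_k p_k of guessing the most probable class, which is the error quantity such bounds relate the entropies to. Function names here are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: f-entropy u(p_1, ..., p_m) = sum_k f(p_k) for a concave f,
# plus the Bayes probability of error 1 - max_k p_k for a discrete distribution.
import math

def f_entropy(p, f):
    """f-entropy u(p) = sum_k f(p_k), where f is a concave function on [0, 1]."""
    return sum(f(pk) for pk in p)

def shannon_f(pk):
    # Shannon's choice: f(p) = -p log p, with f(0) = 0 by convention.
    return 0.0 if pk == 0 else -pk * math.log(pk)

def quadratic_f(pk):
    # Quadratic entropy: f(p) = p(1 - p).
    return pk * (1.0 - pk)

def bayes_error(p):
    # Probability of error when guessing the most probable class: 1 - max_k p_k.
    return 1.0 - max(p)

p = [0.5, 0.3, 0.2]
print("Shannon entropy  :", f_entropy(p, shannon_f))
print("Quadratic entropy:", f_entropy(p, quadratic_f))
print("Bayes error      :", bayes_error(p))
```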

References

G. T. Toussaint, "Probability of error and equivocation of order α," IEEE Trans. Inform. Theory, 1978.

G. T. Toussaint, "A generalization of Shannon's equivocation and the Fano bound," IEEE Trans. Systems, Man, and Cybernetics, 1977.

M. Ben-Bassat and J. Raviv, "Rényi's entropy, its properties and use in pattern recognition," presented at the Workshop on Pattern Recognition and Artificial Intelligence, Hyannis, 1976.

G. T. Toussaint, "On information transmission, nonparametric classification and measuring dependence between random variables," in Proceedings of the Symposium on Statistics and Related Topics, 1974.

T. Ito, "Approximate error bounds in pattern recognition," in Machine Intelligence, 1972.

C. H. Chen, "Theoretical comparison of a class of feature selection criteria in pattern recognition," IEEE Trans. Comput., 1971.
