Maximally Informative Statistics

Abstract

In this paper we propose a Bayesian, information-theoretic approach to dimensionality reduction. The approach is formulated as a variational principle on mutual information, and seamlessly addresses the notions of sufficiency, relevance, and representation. Maximally informative statistics are shown to minimize a Kullback-Leibler distance between posterior distributions. Illustrating the approach, we derive the maximally informative one-dimensional statistic for a random sample from the Cauchy distribution.
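As a sketch of the equivalence stated in the abstract (the notation below, with \Theta the parameter, X the data, and T(X) a statistic, is assumed here rather than taken from the paper): because T(X) is a deterministic function of X, the chain rule for mutual information gives

I(\Theta; X) \;=\; I(\Theta; T(X)) \;+\; \mathbb{E}_{X}\!\left[ D_{\mathrm{KL}}\!\bigl( p(\theta \mid X) \,\big\|\, p(\theta \mid T(X)) \bigr) \right],

so maximizing the mutual information I(\Theta; T(X)) over statistics T is equivalent to minimizing the expected Kullback-Leibler distance between the posterior given the full data and the posterior given the statistic, consistent with the claim above.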

Cite this paper

@inproceedings{Wolf2000MaximallyIS,
  title={Maximally Informative Statistics},
  author={David R. Wolf},
  year={2000}
}