- Published 2014 in STOC

We study the problem of computing max-entropy distributions over a discrete set of objects subject to observed marginals. There has been a tremendous amount of interest in such distributions due to their applicability in areas such as statistical physics, economics, biology, information theory, machine learning, combinatorics and algorithms. However, a rigorous and systematic study of how to compute such distributions has been lacking. Since the underlying set of discrete objects can be exponential in the input size, the first question in such a study is whether max-entropy distributions have polynomially-sized descriptions. We start by giving a structural result which shows that such succinct descriptions exist under very general conditions. Subsequently, we use techniques from convex programming to give a meta-algorithm that can efficiently (approximately) compute max-entropy distributions <i>provided one can efficiently (approximately) count the underlying discrete set</i>. Thus, we can translate a host of existing counting algorithms, developed in an unrelated context, into algorithms that compute max-entropy distributions. Conversely, we prove that counting oracles are <i>necessary</i> for computing max-entropy distributions: we show how algorithms that compute max-entropy distributions can be converted into counting algorithms.
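The counting-based approach described above can be illustrated on a toy instance. The sketch below (an illustrative assumption, not the paper's algorithm) takes the discrete set to be the spanning trees of the triangle graph, encoded as edge-indicator vectors, and fits the max-entropy distribution matching target edge marginals. The max-entropy distribution has the exponential-family form p(x) ∝ exp(λ·x), so it suffices to find the dual variables λ by gradient descent on log Z(λ) − λ·θ, where the gradient E[x] − θ is computed by a brute-force "counting oracle" (the names `counting_oracle`, `trees`, and `theta` are invented for this example):

```python
import math

# Discrete set: the 3 spanning trees of the triangle K3, as edge-indicator vectors.
trees = [(1, 1, 0), (1, 0, 1), (0, 1, 1)]
# Target edge marginals; each tree has 2 edges, so they must sum to 2.
theta = (0.8, 0.7, 0.5)

def counting_oracle(lam):
    """Brute-force 'counting': partition function Z(lam) and marginals E[x]."""
    weights = [math.exp(sum(l * x for l, x in zip(lam, t))) for t in trees]
    Z = sum(weights)
    marg = [sum(w * t[i] for w, t in zip(weights, trees)) / Z for i in range(3)]
    return Z, marg

# Gradient descent on the convex dual f(lam) = log Z(lam) - lam . theta,
# whose gradient is E[x] - theta.
lam = [0.0, 0.0, 0.0]
for _ in range(2000):
    _, marg = counting_oracle(lam)
    lam = [l - 0.5 * (m, t)[0] + 0.5 * t for l, m, t in zip(lam, marg, theta)]
    # equivalently: lam_i <- lam_i - 0.5 * (marg_i - theta_i)

Z, marg = counting_oracle(lam)
# marg should now be close to theta, and p(tree) = exp(lam . tree) / Z
# is the (approximate) max-entropy distribution with those marginals.
```

In this tiny example the "oracle" simply enumerates the set; the point of the paper's meta-algorithm is that any efficient (approximate) counting algorithm can play this role even when the set is exponentially large.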

@inproceedings{Singh2014EntropyOA,
  title={Entropy, optimization and counting},
  author={Mohit Singh and Nisheeth K. Vishnoi},
  booktitle={STOC},
  year={2014}
}