Thomas M. Cover

The case of $n$ unity-variance random variables $x_1, x_2, \ldots, x_n$ governed by the joint probability density $w(x_1, x_2, \ldots, x_n)$ is considered, where the density depends on the (normalized) cross-covariances $\rho_{ij} = E[(x_i - \bar{x}_i)(x_j - \bar{x}_j)]$. It is shown that the condition holds for an "arbitrary" function $f(x_1, x_2, \ldots, x_n)$ of $n$ variables if and only if the…
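As a point of orientation (my notation, since the abstract is truncated before the condition itself is stated): unity variance makes the cross-covariances coincide with the correlation coefficients,

$$E[(x_i - \bar{x}_i)^2] = 1, \qquad \rho_{ij} = E[(x_i - \bar{x}_i)(x_j - \bar{x}_j)] = \operatorname{corr}(x_i, x_j).$$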
A relay channel consists of an input $x_1$, a relay output $y_1$, a channel output $y$, and a relay sender $x_2$ (whose transmission is allowed to depend on the past symbols $y_1$). The dependence of the received symbols upon the inputs is given by $p(y, y_1 \mid x_1, x_2)$. The channel is assumed to be memoryless. In this paper the following capacity theorems are proved. 1) …
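The enumeration of theorems is cut off above; for orientation only, a well-known bound from this line of work (a sketch, not necessarily item 1 of the abstract) is the max-flow min-cut upper bound

$$C \le \max_{p(x_1, x_2)} \min\bigl\{ I(X_1, X_2; Y),\; I(X_1; Y, Y_1 \mid X_2) \bigr\},$$

which is tight for degraded relay channels.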
This paper develops the separating capacities of families of nonlinear decision surfaces by a direct application of a theorem in classical combinatorial geometry. It is shown that a family of surfaces having $d$ degrees of freedom has a natural separating capacity of $2d$ pattern vectors, thus extending and unifying results of Winder and others on the…
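A short sketch (mine, not code from the paper) of the counting argument behind the capacity claim: for $n$ points in general position, a family of surfaces with $d$ degrees of freedom realizes $C(n, d) = 2 \sum_{k=0}^{d-1} \binom{n-1}{k}$ of the $2^n$ possible dichotomies, and at $n = 2d$ exactly half of them:

    from math import comb

    def separable_dichotomies(n: int, d: int) -> int:
        # Counting function: dichotomies of n points in general position
        # realizable by a decision surface with d degrees of freedom.
        return 2 * sum(comb(n - 1, k) for k in range(d))

    d = 10
    n = 2 * d  # at the "natural separating capacity" of 2d patterns...
    print(separable_dichotomies(n, d) / 2**n)  # ...exactly 0.5 are realizable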
We introduce the problem of a single source attempting to communicate information simultaneously to several receivers. The intent is to model the situation of a broadcaster with multiple receivers or a lecturer with many listeners. Thus several different channels with a common input alphabet are specified. We shall determine the families of simultaneously…
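For orientation, the now-standard rate region for the degraded two-receiver case (a sketch; achievability and the converse were settled in later work by Bergmans and Gallager, so this is not necessarily what the truncated sentence goes on to determine): for some auxiliary variable $U$,

$$R_2 \le I(U; Y_2), \qquad R_1 \le I(X; Y_1 \mid U).$$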
The role of inequalities in information theory is reviewed and the relationship of these inequalities to inequalities in other branches of mathematics is developed. Index Terms: Information inequalities, entropy power, Fisher information, uncertainty principles. I. PREFACE: INEQUALITIES IN INFORMATION THEORY. Inequalities in information theory have been driven…
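One representative member of the family reviewed here, picked out by the "entropy power" index term (Shannon's entropy power inequality): for independent random $n$-vectors $X$ and $Y$ with densities,

$$e^{2h(X+Y)/n} \ge e^{2h(X)/n} + e^{2h(Y)/n},$$

with equality when $X$ and $Y$ are Gaussian with proportional covariances.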
The maximum entropy noise under a lag $p$ autocorrelation constraint is known by Burg's theorem to be the $p$th-order Gauss–Markov process satisfying these constraints. The question is, what is the worst additive noise for a communication channel given these constraints? Is it the maximum entropy noise? The problem becomes one of extremizing the mutual information…
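In symbols (my formulation of the extremization; the power constraint $P$ on the signal is an assumption added for concreteness): writing $\mathcal{N}$ for the set of noise processes meeting the autocorrelation constraints, the question is whether the maximum entropy noise solves

$$\min_{Z \in \mathcal{N}} \; \max_{X:\, EX^2 \le P} I(X; X + Z).$$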
We exhibit an algorithm for portfolio selection that asymptotically outperforms the best stock in the market. Let $x_i = (x_{i1}, x_{i2}, \ldots, x_{im})^t$ denote the performance of the stock market on day $i$, where $x_{ij}$ is the factor by which the $j$th stock increases on day $i$. Let $b_i = (b_{i1}, b_{i2}, \ldots, b_{im})^t$, $b_{ij} \ge 0$, $\sum_j b_{ij} = 1$, denote the proportion $b_{ij}$ of wealth…
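A minimal sketch (mine, with synthetic data; not the paper's universal weighting scheme) of the objects just defined: the wealth $S_n(b) = \prod_{i=1}^n b^t x_i$ earned by rebalancing to fixed proportions $b$ each day, compared with buying and holding the best single stock:

    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 250, 3                          # days, stocks
    X = rng.lognormal(0.0, 0.02, (n, m))   # X[i, j]: growth factor of stock j on day i

    def wealth(b, X):
        # S_n(b) = prod_i (b . x_i): wealth from rebalancing to b daily
        return np.prod(X @ b)

    b = np.ones(m) / m                     # uniform constant rebalanced portfolio
    best_stock = np.prod(X, axis=0).max()  # buy-and-hold wealth of the best stock
    print(wealth(b, X), best_stock)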
The successive refinement of information consists of first approximating data using a few bits of information, then iteratively improving the approximation as more and more information is supplied. The goal is to achieve an optimal description at each stage. In general an ongoing description is sought which is rate-distortion optimal whenever it is…
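The characterization associated with this work, stated as a sketch in my notation: for distortion levels $D_2 < D_1$, an optimal two-stage description exists if and only if there are conditional distributions achieving the rate-distortion function at each level whose reproductions form a Markov chain

$$\hat{X}_1 \to \hat{X}_2 \to X,$$

so that the coarse description is recoverable from the fine one without rate loss.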