We analyse an analog of the entropy-power inequality for the weighted entropy. In particular, we discuss connections with the weighted Lieb splitting inequality and a Gaussian additive noise formula. Examples and counterexamples are given for some classes of probability distributions.
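For context, a minimal sketch of the quantities involved, assuming the Belis–Guiaşu form of the weighted entropy with a weight function \(\varphi \ge 0\); the weighted analog studied in this abstract presumably replaces \(h\) with \(h^{w}_{\varphi}\) in the classical entropy-power inequality:

```latex
% Weighted (Belis--Guiasu) differential entropy, weight \varphi \ge 0:
h^{w}_{\varphi}(X) = -\int_{\mathbb{R}^n} \varphi(x)\, f(x) \log f(x)\, \mathrm{d}x .
% Classical entropy-power inequality for independent X, Y in R^n:
e^{2h(X+Y)/n} \;\ge\; e^{2h(X)/n} + e^{2h(Y)/n} .
```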
A number of inequalities for weighted entropies are proposed, mirroring properties of the standard (Shannon) entropy and related quantities, beginning with the weighted Gibbs inequality and its consequences, along with a number of theoretical suggestions. The purpose of this note is to extend a number of inequalities for the standard (Shannon) entropy to the case of the…
We produce a series of results extending information-theoretical inequalities (discussed by Dembo–Cover–Thomas in 1989-1991) to a weighted version of entropy. The resulting inequalities involve the Gaussian weighted entropy; they imply a number of new relations for determinants of positive-definite matrices.
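As background on how entropy inequalities yield determinant inequalities (the standard route in Dembo–Cover–Thomas, sketched here for the unweighted Gaussian case): for \(X \sim N(0, K)\) in \(\mathbb{R}^n\),

```latex
% Entropy of a Gaussian vector with covariance K:
h(X) = \tfrac{1}{2}\log\bigl((2\pi e)^n \det K\bigr).
% Applying the entropy-power inequality to independent Gaussians with
% covariances A and B gives Minkowski's determinant inequality:
\bigl(\det(A+B)\bigr)^{1/n} \;\ge\; (\det A)^{1/n} + (\det B)^{1/n}.
```

The weighted Gaussian entropy in the abstract presumably plays the role of \(h\) in an analogous derivation.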
A number of simple inequalities for weighted entropies are proposed, mirroring properties of the standard (Shannon) entropy and related quantities.
The aim of this paper is to analyze the weighted Ky Fan inequality proposed in . A number of numerical simulations involving the exponential weight function are given. We show that in several cases and types of examples one can obtain an improvement of the standard Ky Fan inequality.
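The weighted version analysed in the paper is not reproduced here, but the standard Ky Fan inequality it refines is the concavity of \(\log\det\) on the positive-definite cone. A minimal numerical sanity check of that standard inequality, sketched with random 2×2 symmetric positive-definite matrices (the helper names are illustrative, not from the paper):

```python
import math
import random

def det2(m):
    """Determinant of a 2x2 matrix given as nested lists."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def rand_spd2(rng):
    """Random symmetric positive-definite 2x2 matrix: G^T G + I."""
    a, b, c, d = (rng.uniform(-1.0, 1.0) for _ in range(4))
    return [[a * a + c * c + 1.0, a * b + c * d],
            [a * b + c * d, b * b + d * d + 1.0]]

def kyfan_holds(A, B, lam=0.5, tol=1e-12):
    """Standard Ky Fan inequality (concavity of log det):
    log det(lam*A + (1-lam)*B) >= lam*log det(A) + (1-lam)*log det(B)."""
    M = [[lam * A[i][j] + (1.0 - lam) * B[i][j] for j in range(2)]
         for i in range(2)]
    lhs = math.log(det2(M))
    rhs = lam * math.log(det2(A)) + (1.0 - lam) * math.log(det2(B))
    return lhs >= rhs - tol

rng = random.Random(0)
print(all(kyfan_holds(rand_spd2(rng), rand_spd2(rng)) for _ in range(1000)))
# prints True
```

A weighted refinement, as the abstract suggests, would tighten the right-hand side for suitable weight functions.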
We generalize the weighted cumulative entropies (WCRE and WCE), introduced in , for a system or component lifetime. Reflecting properties of cumulative entropies, several bounds and inequalities for the WCRE are proposed.
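For reference, a sketch of the underlying quantities: the cumulative residual entropy (CRE) of a nonnegative lifetime \(X\) with survival function \(\bar F(x) = P(X > x)\), and one common weighted form (the linear weight \(\varphi(x) = x\) is frequent in the WCRE literature; the exact weight intended here is an assumption):

```latex
% Cumulative residual entropy (Rao et al.):
\mathcal{E}(X) = -\int_0^\infty \bar F(x)\,\log \bar F(x)\,\mathrm{d}x .
% Weighted CRE with weight \varphi(x), e.g. \varphi(x) = x:
\mathcal{E}^{w}_{\varphi}(X) = -\int_0^\infty \varphi(x)\,\bar F(x)\,\log \bar F(x)\,\mathrm{d}x .
```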
This article addresses the proof of the entropy power inequality (EPI), proposed by Shannon, an important tool in the analysis of Gaussian channels of information transmission. We analyse continuity properties of the mutual entropy of the input and output signals in an additive memoryless channel and discuss assumptions under which the…