We analyse an analog of the entropy-power inequality for the weighted entropy. In particular, we discuss connections with a weighted version of Lieb's splitting inequality and a Gaussian additive-noise formula. Examples and counterexamples are given for some classes of probability distributions.
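For context, the weighted (differential) entropy behind these inequalities is h^w_φ(f) = -∫ φ(x) f(x) log f(x) dx for a weight function φ ≥ 0; with φ ≡ 1 it reduces to the Shannon differential entropy. A minimal numerical sketch (the grid, weight choices, and helper names are illustrative, not taken from the paper):

```python
import numpy as np

def trapezoid(y, x):
    # composite trapezoidal rule on a grid (avoids the deprecated np.trapz)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def weighted_entropy(f_vals, phi_vals, xs):
    """h^w_phi(f) = -integral of phi(x) f(x) log f(x) dx, on the grid xs."""
    safe_f = np.where(f_vals > 0, f_vals, 1.0)       # guard log(0)
    integrand = np.where(f_vals > 0, phi_vals * f_vals * np.log(safe_f), 0.0)
    return -trapezoid(integrand, xs)

xs = np.linspace(-10.0, 10.0, 100001)
f = np.exp(-xs**2 / 2) / np.sqrt(2 * np.pi)          # standard normal density

# phi == 1 recovers the Shannon differential entropy 0.5*log(2*pi*e)
h_shannon = weighted_entropy(f, np.ones_like(xs), xs)
# phi(x) = x^2 puts more weight on the tails; for N(0,1) the closed form
# is E[X^4]/2 + 0.5*log(2*pi)*E[X^2] = 3/2 + 0.5*log(2*pi)
h_weighted = weighted_entropy(f, xs**2, xs)
```

Both values can be checked against the closed forms for the standard normal, which is what makes the Gaussian case a convenient testing ground for the weighted inequalities.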
We produce a series of results extending information-theoretical inequalities (discussed by Dembo–Cover–Thomas in 1989-1991) to a weighted version of entropy. The resulting inequalities involve the Gaussian weighted entropy; they imply a number of new relations for determinants of positive-definite matrices.
A number of inequalities for weighted entropies are proposed, mirroring properties of the standard (Shannon) entropy and related quantities. Topics include the weighted Gibbs inequality and its consequences, along with a number of theoretical suggestions. The purpose of this note is to extend a number of inequalities for the standard (Shannon) entropy to the case of the…
A number of simple inequalities for weighted entropies are proposed, mirroring properties of the standard (Shannon) entropy and related quantities.
In this paper the author analyzes the weighted Rényi entropy in order to derive several inequalities in the weighted case. Furthermore, using the proposed notions of the α-th generalized deviation and the (α, p)-th weighted Fisher information, extended versions of the moment-entropy, Fisher information and Cramér–Rao inequalities in terms of generalized Gaussian densities…
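As a point of reference, with weight function φ ≡ 1 the weighted Rényi entropy reduces to the classical Rényi entropy h_α(f) = (1/(1-α)) log ∫ f(x)^α dx. A quick numerical sanity check for a standard normal (the grid and the choice α = 2 are illustrative):

```python
import numpy as np

alpha = 2.0
xs = np.linspace(-12.0, 12.0, 200001)
f = np.exp(-xs**2 / 2) / np.sqrt(2 * np.pi)      # standard normal density

# composite trapezoidal rule for the integral of f^alpha
integral = float(np.sum((f[1:]**alpha + f[:-1]**alpha) * np.diff(xs)) / 2.0)
h_num = np.log(integral) / (1 - alpha)

# closed form for N(0,1): h_alpha = 0.5*log(2*pi) - log(alpha)/(2*(1-alpha));
# as alpha -> 1 this recovers the Shannon value 0.5*log(2*pi*e)
h_closed = 0.5 * np.log(2 * np.pi) - np.log(alpha) / (2 * (1 - alpha))
```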
We generalize the weighted cumulative entropies (WCRE and WCE), introduced in , for a system or component lifetime. Mirroring properties of cumulative entropies, several bounds and inequalities for the WCRE are proposed.
In this note the author uses order statistics to estimate the WCRE and WCE in terms of the empirical distribution and survival functions. Examples with both normal and exponential weight functions (WFs) are analyzed.
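A minimal sketch of such a plug-in estimate for the WCRE (the linear weight function, sample size, and helper name are illustrative choices, not the paper's exact estimator): the empirical survival function is piecewise constant between order statistics, so the integral -∫ φ(x) S(x) log S(x) dx becomes a finite sum over the spacings.

```python
import numpy as np

def empirical_wcre(sample, phi=lambda x: x):
    """Plug-in estimate of the WCRE  -integral of phi(x)*S(x)*log S(x) dx.
    The empirical survival function S_n equals 1 - i/n on [x_(i), x_(i+1)),
    so the integral reduces to a sum over spacings of order statistics.
    phi(x) = x is the usual linear weight; the trapezoidal segment
    integral below is exact for linear phi."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    s = 1.0 - np.arange(1, n) / n                         # S_n on each spacing
    seg = 0.5 * (phi(x[:-1]) + phi(x[1:])) * np.diff(x)   # integral of phi
    return float(-np.sum(seg * s * np.log(s)))

rng = np.random.default_rng(0)
est = empirical_wcre(rng.exponential(scale=1.0, size=100_000))
# for Exp(1) with phi(x) = x, the exact WCRE is the integral of
# x * e^{-x} * x dx over (0, inf), i.e. Gamma(3) = 2
```

The WCE case is analogous, with the empirical distribution function F_n in place of the survival function.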
We propose a direct estimation method for Rényi and f-divergence measures based on a new graph-theoretical interpretation. Suppose that we are given two sample sets X and Y, with N and M samples respectively, where η := M/N is a constant. Considering the k-nearest neighbor (k-NN) graph of Y in the joint data set (X, Y), we show that the average…
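The flavor of such neighbor-counting estimators can be sketched as follows. This toy version uses brute-force distances and an add-one-smoothed count ratio plugged into f(t) = -log t (a crude KL-type estimate); it only illustrates the idea and is not the estimator proposed in the paper:

```python
import numpy as np

def knn_divergence(X, Y, k=5):
    """Toy neighbor-counting divergence sketch.  For each point of Y,
    count how many of its k nearest neighbours in the pooled sample
    X u Y come from X versus Y; the (add-one smoothed) count ratio is a
    local density-ratio estimate, averaged through f(t) = -log t.
    X: (N, d) samples from p;  Y: (M, d) samples from q."""
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    N, M = len(X), len(Y)
    Z = np.vstack([X, Y])
    from_X = np.concatenate([np.ones(N, dtype=bool), np.zeros(M, dtype=bool)])
    total = 0.0
    for j in range(M):
        d = np.linalg.norm(Z - Y[j], axis=1)
        d[N + j] = np.inf                       # exclude the query point itself
        nn = np.argpartition(d, k)[:k]          # indices of k nearest neighbours
        Ni = int(np.sum(from_X[nn]))            # neighbours drawn from X
        Mi = k - Ni                             # neighbours drawn from Y
        ratio = ((Ni + 1) / N) / ((Mi + 1) / M) # smoothed density-ratio estimate
        total -= np.log(ratio)
    return total / M

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, (400, 2))              # samples from p
d_same = knn_divergence(X, rng.normal(0.0, 1.0, (400, 2)))   # q = p
d_far = knn_divergence(X, rng.normal(4.0, 1.0, (400, 2)))    # q far from p
```

When the two samples come from the same distribution the estimate is near zero; well-separated samples drive it up, which is the behavior a divergence estimate should have.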
In this paper, we review properties of Fisher information matrices in the weighted setting and discuss inequalities and bounds on them obtained using reduced weight functions. In particular, an extended form of the Fisher information inequality previously established in  is given. Further, along with a generalized De Bruijn identity, we provide a new interpretation of the…
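As background for the De Bruijn part, the classical (unweighted) identity d/dt h(X + √t Z) = (1/2) J(X + √t Z), with Z standard normal, can be checked numerically in the Gaussian case, where both sides have closed forms. The constants below are illustrative:

```python
import numpy as np

# For X ~ N(0, s2), X + sqrt(t)*Z ~ N(0, s2 + t), so
#   h(t) = 0.5 * log(2*pi*e*(s2 + t))   and   J(t) = 1 / (s2 + t).
s2, t, eps = 2.0, 0.5, 1e-6

def h(tt):
    return 0.5 * np.log(2 * np.pi * np.e * (s2 + tt))

lhs = (h(t + eps) - h(t - eps)) / (2 * eps)   # d/dt h, central difference
rhs = 0.5 / (s2 + t)                          # (1/2) * Fisher information
```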
Information-theoretic measures (e.g. the Kullback–Leibler divergence and Shannon mutual information) have been used for exploring possibly nonlinear multivariate dependencies in high dimensions. If these dependencies are assumed to follow a Markov factor graph model, this exploration process is called structure discovery. For discrete-valued samples,…