
We discuss a family of estimators for the entropy rate of a stationary ergodic process and prove their pointwise and mean consistency under a Doeblin-type mixing condition. The estimators are Cesàro averages of longest match-lengths, and their consistency follows from a generalized ergodic theorem due to Maker. We provide examples of their performance… (More)
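As a rough illustration of the idea behind such estimators (a minimal sketch, assuming a binary-string input and the Wyner–Ziv-type asymptotics L_i / log₂ i → 1/H; the function names are ours, not the paper's):

```python
import math
import random

def match_length(x: str, i: int) -> int:
    """L_i: length of the shortest prefix of x[i:] that does NOT occur
    as a contiguous substring of the past x[:i]."""
    past = x[:i]
    l = 1
    while i + l <= len(x) and x[i:i + l] in past:
        l += 1
    return l

def entropy_rate_estimate(x: str, start: int = 2) -> float:
    """Sketch of a Cesaro-average match-length estimator: since
    L_i / log2(i) tends to 1/H, the inverse of the Cesaro average of
    these ratios estimates the entropy rate H (in bits per symbol)."""
    terms = [match_length(x, i) / math.log2(i) for i in range(start, len(x))]
    return len(terms) / sum(terms)

random.seed(0)
bits = "".join(random.choice("01") for _ in range(3000))
print(entropy_rate_estimate(bits))  # roughly 1 bit for i.i.d. fair coin flips (convergence is slow)
```

The O(n²) substring search keeps the sketch short; a suffix-tree implementation would be used for long sequences.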

We analyse an analog of the entropy-power inequality for the weighted entropy. In particular, we discuss connections with the weighted Lieb splitting inequality and a Gaussian additive-noise formula. Examples and counterexamples are given for some classes of probability distributions.

A number of inequalities for the weighted entropies are proposed, mirroring properties of the standard (Shannon) entropy and related quantities: the weighted Gibbs inequality and its consequences, along with a number of theoretical suggestions. The purpose of this note is to extend a number of inequalities for the standard (Shannon) entropy to the case of the… (More)

We produce a series of results extending information-theoretical inequalities (discussed by Dembo–Cover–Thomas in 1989-1991) to a weighted version of entropy. The resulting inequalities involve the Gaussian weighted entropy; they imply a number of new relations for determinants of positive-definite matrices.
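For orientation, the kind of determinant relation these entropy inequalities yield in the classical (unweighted) case is Minkowski's determinant inequality, a standard consequence of the entropy-power inequality (stated here for context, not quoted from the abstract):

```latex
% Evaluating the entropy-power inequality on independent Gaussian vectors
% X ~ N(0, K_1), Y ~ N(0, K_2), using h(X) = (1/2) log((2 pi e)^n det K_1),
% recovers Minkowski's determinant inequality for positive-definite
% n x n matrices K_1, K_2:
\[
  \bigl(\det(K_1 + K_2)\bigr)^{1/n} \;\ge\; (\det K_1)^{1/n} + (\det K_2)^{1/n}.
\]
```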

A number of simple inequalities for the weighted entropies are proposed, mirroring properties of the standard (Shannon) entropy and related quantities.

We generalize the weighted cumulative entropies (WCRE and WCE), introduced in [5], for a system or component lifetime. Mirroring properties of cumulative entropies, several bounds and inequalities for the WCRE are proposed.

This article addresses the proof of the entropy power inequality (EPI), proposed by Shannon and an important tool in the analysis of Gaussian information-transmission channels. We analyse continuity properties of the mutual entropy of the input and output signals in an additive memoryless channel and discuss assumptions under which the… (More)
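For reference, Shannon's EPI in its standard form (stated here for context; the article's precise assumptions may differ) reads:

```latex
% Entropy power of an n-dimensional random vector X with
% differential entropy h(X):
\[
  N(X) = \frac{1}{2\pi e}\, e^{2 h(X)/n}.
\]
% Entropy-power inequality for independent X and Y:
\[
  N(X + Y) \;\ge\; N(X) + N(Y),
\]
% with equality iff X and Y are Gaussian with proportional covariances.
```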

In this paper, we review properties of Fisher information matrices in the weighted setting and discuss inequalities/bounds on them using reduced weight functions. In particular, an extended form of the Fisher information inequality previously established in [6] is given. Further, along with a generalized De Bruijn identity, we provide a new interpretation of the… (More)
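For context, the classical De Bruijn identity that the abstract's generalized version extends links differential entropy to Fisher information (standard form, not quoted from the paper):

```latex
% Z is a standard Gaussian vector independent of X; J denotes the
% Fisher information of the Gaussian-perturbed variable X + sqrt(t) Z:
\[
  \frac{\mathrm{d}}{\mathrm{d}t}\, h\!\left(X + \sqrt{t}\,Z\right)
  \;=\; \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right), \qquad t > 0.
\]
```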

The weighted entropy $H^{\rm w}_\phi (X)=H^{\rm w}_\phi (f)$ of a random variable $X$ with values $x$ and a probability-mass/density function $f$ is defined as the mean value ${\mathbb E} I^{\rm w}_\phi(X)$ of the weighted information $I^{\rm w}_\phi (x)=-\phi (x)\log\,f(x)$. Here $x\mapsto\phi (x)\in{\mathbb R}$ is a given weight function (WF) indicating a… (More)
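A minimal numeric illustration of this definition in the discrete case (`weighted_entropy` and its arguments are our names, not the paper's; the constant weight φ ≡ 1 recovers the standard Shannon entropy):

```python
import math

def weighted_entropy(pmf, phi):
    """Weighted entropy H^w_phi(X) = E[-phi(X) log f(X)] of a discrete
    random variable with probability-mass function pmf: x -> f(x).
    phi is the weight function; phi == 1 gives Shannon entropy (nats)."""
    return sum(f * (-phi(x) * math.log(f)) for x, f in pmf.items() if f > 0)

fair = {0: 0.5, 1: 0.5}
print(weighted_entropy(fair, lambda x: 1.0))    # Shannon entropy: log 2 ~ 0.693
print(weighted_entropy(fair, lambda x: 1 + x))  # weight emphasising the outcome x = 1
```

With the weight φ(x) = 1 + x the contribution of the outcome x = 1 is doubled, giving 1.5·log 2, which is the "value-relevance" role the weight function plays in the definition above.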