
We produce a series of results extending information-theoretical inequalities (discussed by Dembo–Cover–Thomas in 1989-1991) to a weighted version of entropy. The resulting inequalities involve the Gaussian weighted entropy; they imply a number of new relations for determinants of positive-definite matrices.

A number of simple inequalities for weighted entropies are proposed, mirroring properties of the standard (Shannon) entropy and related quantities.

The Shannon noiseless coding theorem (the data-compression principle) asserts that for an information source with an alphabet X = {0, …, m − 1} and an asymptotic equipartition property, one can reduce the number of stored strings (x_0, …, x_{n−1}) ∈ X^n to about 2^{nh} with an arbitrarily small error probability. Here h is the entropy rate of the source (calculated…
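The entropy rate h in the statement above can be illustrated with a small sketch. The choice of an i.i.d. binary (Bernoulli) source and the function name below are illustrative assumptions, not taken from the abstract:

```python
import math

def entropy_rate_bernoulli(p):
    """Entropy rate h (bits per symbol) of an i.i.d. binary source
    emitting 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# For a biased source (p = 0.1) over strings of length n = 100,
# the AEP says roughly 2^(n*h) typical strings suffice, far fewer
# than the 2^n strings in X^n.
n, p = 100, 0.1
h = entropy_rate_bernoulli(p)
typical_exponent = n * h  # about 46.9 bits, versus 100 for lossless naive storage
```

For a fair source (p = 0.5), h = 1 bit per symbol and no compression is possible; the gain appears only when the source is biased.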

The weighted entropy H^w_φ(X) = H^w_φ(f) of a random variable X with values x and a probability-mass/density function f is defined as the mean value E I^w…
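A minimal sketch of the discrete weighted entropy, assuming the standard form H^w_φ(f) = −Σ_x φ(x) f(x) log f(x) with weight function φ; the function names and the example distribution are illustrative assumptions:

```python
import math

def weighted_entropy(probs, weights):
    """Discrete weighted entropy: -sum over x of phi(x) * f(x) * log f(x),
    with natural logarithms; zero-probability terms contribute nothing."""
    return -sum(w * p * math.log(p) for w, p in zip(weights, probs) if p > 0)

# With phi(x) = 1 for every x, the weighted entropy reduces to the
# standard Shannon entropy of the distribution.
probs = [0.5, 0.25, 0.25]
shannon = weighted_entropy(probs, [1.0, 1.0, 1.0])
# A non-constant weight emphasizes selected outcomes.
weighted = weighted_entropy(probs, [2.0, 1.0, 1.0])
```

Choosing φ ≡ 1 recovers the unweighted case, which is why the weighted inequalities above are described as mirroring the Shannon-entropy ones.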
