# Elements of Information Theory

@book{Cover1991ElementsOI, title={Elements of Information Theory}, author={Thomas M. Cover and Joy A. Thomas}, year={1991} }

- Preface to the Second Edition
- Preface to the First Edition
- Acknowledgments for the Second Edition
- Acknowledgments for the First Edition
- 1. Introduction and Preview
- 1.1 Preview of the Book
- 2. Entropy, Relative Entropy, and Mutual Information
- 2.1 Entropy
- 2.2 Joint Entropy and Conditional Entropy
- 2.3 Relative Entropy and Mutual Information
- 2.4 Relationship Between Entropy and Mutual Information
- 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information
- 2.6 Jensen's…
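The early chapters listed above define the core quantities: entropy, joint and conditional entropy, mutual information, and the chain rules relating them. As a minimal sketch of those definitions (the toy joint distribution and all variable names here are chosen for illustration, not taken from the book):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution (an iterable of probabilities)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Toy joint distribution p(x, y) over a 2x2 alphabet.
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

# Marginals p(x) and p(y).
px = [sum(v for (x, _), v in joint.items() if x == a) for a in (0, 1)]
py = [sum(v for (_, y), v in joint.items() if y == b) for b in (0, 1)]

H_XY = entropy(joint.values())          # joint entropy H(X, Y)
H_X, H_Y = entropy(px), entropy(py)

# Chain rule: H(X, Y) = H(X) + H(Y|X).
H_Y_given_X = H_XY - H_X

# Mutual information: I(X;Y) = H(X) + H(Y) - H(X, Y), always nonnegative.
I_XY = H_X + H_Y - H_XY
```

For this distribution H(X, Y) = 1.75 bits, and the small positive I(X;Y) reflects the weak dependence between the two coordinates.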

## 43,048 Citations

The Homological Nature of Entropy

- Mathematics, Computer Science
- Entropy
- 2015

It is proposed that entropy is a universal cohomological class in a theory associated to a family of observable quantities and a family of probability distributions, giving rise to a new kind of topology for information processes that accounts for the main information functions.

On Divergences and Informations in Statistics and Information Theory

- Computer Science
- IEEE Transactions on Information Theory
- 2006

The paper deals with the f-divergences of Csiszár, generalizing the discrimination information of Kullback, the total variation distance, the Hellinger divergence, and the Pearson divergence. All…
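The divergences named above are all instances of one template: an f-divergence is determined by a convex generator f with f(1) = 0. A hedged sketch of that template (the generator lambdas and helper name below are chosen for this example, and q is assumed strictly positive):

```python
import math

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)), assuming q(x) > 0 everywhere."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Standard convex generators recovering the named divergences.
kl        = lambda t: t * math.log(t) if t > 0 else 0.0   # Kullback-Leibler
tv        = lambda t: 0.5 * abs(t - 1)                    # total variation
hellinger = lambda t: (math.sqrt(t) - 1) ** 2             # squared Hellinger
pearson   = lambda t: (t - 1) ** 2                        # Pearson chi-square

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

d_kl = f_divergence(p, q, kl)   # nonnegative, zero iff p == q
d_tv = f_divergence(p, q, tv)   # equals 0.5 * sum |p - q| = 0.1 here
```

Since f is convex with f(1) = 0, Jensen's inequality gives D_f(P||Q) ≥ 0 for every generator, which is why all these divergences share their key properties.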

Mutual dimension, data processing inequalities, and randomness

- Computer Science
- 2016

A framework for mutual dimension is developed, i.e., the density of algorithmic mutual information between two infinite objects, with properties similar to those of classical Shannon mutual information.
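The classical property this algorithmic analogue mirrors is the data processing inequality: for a Markov chain X → Y → Z, processing cannot increase information, so I(X;Z) ≤ I(X;Y). A sketch of that classical fact on a pair of binary symmetric channels (all names and parameters here are illustrative):

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), v in joint.items():
        px[x] = px.get(x, 0) + v
        py[y] = py.get(y, 0) + v
    return sum(v * math.log2(v / (px[x] * py[y]))
               for (x, y), v in joint.items() if v > 0)

# Markov chain X -> Y -> Z: Y is X through a BSC, Z is Y through the same BSC.
p_x = {0: 0.5, 1: 0.5}
flip = 0.1  # crossover probability of each binary symmetric channel

joint_xy = {(x, y): p_x[x] * (1 - flip if x == y else flip)
            for x in (0, 1) for y in (0, 1)}
# Z depends on X only through Y, so p(x, z) = sum_y p(x, y) p(z|y).
joint_xz = {(x, z): sum(joint_xy[(x, y)] * (1 - flip if y == z else flip)
                        for y in (0, 1))
            for x in (0, 1) for z in (0, 1)}

i_xy = mutual_information(joint_xy)
i_xz = mutual_information(joint_xz)  # data processing: i_xz <= i_xy
```

The second channel strictly degrades the first here, so the inequality is strict: I(X;Z) < I(X;Y).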

Information Theoretic Proofs of Entropy Power Inequalities

- Computer Science
- IEEE Transactions on Information Theory
- 2011

A new and brief proof of the EPI is developed through a mutual information inequality, which replaces the Fisher information inequality (FII) of Stam and Blachman and an MMSE inequality of Guo, Shamai, and Verdú used in earlier proofs.
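The EPI itself states that for independent X and Y, the entropy power N(X) = exp(2h(X)/n)/(2πe) satisfies N(X + Y) ≥ N(X) + N(Y), with equality for Gaussians. A sketch of that equality case (function names are chosen for this example; entropies are in nats):

```python
import math

def entropy_power(h, n=1):
    """Entropy power N(X) = exp(2h/n) / (2*pi*e) for an n-dimensional vector."""
    return math.exp(2 * h / n) / (2 * math.pi * math.e)

def gaussian_entropy(var):
    """Differential entropy (nats) of a 1-D Gaussian with variance var."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

vx, vy = 2.0, 3.0
N_x = entropy_power(gaussian_entropy(vx))  # equals vx for a Gaussian
N_y = entropy_power(gaussian_entropy(vy))  # equals vy for a Gaussian

# Independent Gaussians: X + Y is Gaussian with variance vx + vy, and the
# EPI N(X+Y) >= N(X) + N(Y) holds with equality.
N_sum = entropy_power(gaussian_entropy(vx + vy))
```

For a Gaussian, entropy power coincides with variance, so the EPI reduces to the additivity of variances of independent variables; for non-Gaussian summands the inequality is strict.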

Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy

- Computer Science
- ArXiv
- 2005

The combinatorial basis is shown to be the most fundamental (most primitive) of these three bases, and new, generalized definitions of entropy and cross-entropy (supersets of the Boltzmann principle) applicable to non-multinomial systems are presented.

Classical and Quantum Information Theory: An Introduction for the Telecom Scientist

- Computer Science
- 2009

This paper presents a meta-anatomy of quantum information theory, focusing on the role of entropy in the development of knowledge theory and its applications in medicine, science and engineering.

Two-moment inequalities for Rényi entropy and mutual information

- Mathematics, Computer Science
- 2017 IEEE International Symposium on Information Theory (ISIT)
- 2017

This paper explores some applications of a two-moment inequality for the integral of the r-th power of a function, where 0 < r < 1, and establishes an upper bound on the Rényi entropy of a random vector in terms of the two different moments.
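The Rényi entropy generalizes Shannon entropy by a parameter α, and recovers it in the limit α → 1. A small sketch of the discrete case (the distribution and names are chosen here for illustration):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) in bits; requires alpha > 0, alpha != 1."""
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

def shannon_entropy(p):
    """Shannon entropy in bits, the alpha -> 1 limit of H_alpha."""
    return -sum(x * math.log2(x) for x in p if x > 0)

p = [0.5, 0.25, 0.125, 0.125]

# H_alpha is nonincreasing in alpha and tends to the Shannon entropy
# (1.75 bits for this p) as alpha approaches 1.
h_half  = renyi_entropy(p, 0.5)
h_near1 = renyi_entropy(p, 0.999999)
h_2     = renyi_entropy(p, 2.0)       # collision entropy
```

The α = 2 case (collision entropy) and the α → ∞ limit (min-entropy) are the variants that most often appear in the guessing and hypothesis-testing applications cited in this listing.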

On Rényi Information Measures and Their Applications

- Computer Science
- 2020

The contributions of this thesis are as follows: new problems related to guessing, task encoding, hypothesis testing, and horse betting are solved; and two new Rényi measures of dependence and a new conditional Rényi divergence appearing in these problems are analyzed.

Facets of entropy

- Computer Science
- Commun. Inf. Syst.
- 2015

This expository work is an attempt to present a picture of the many facets of the entropy function, including Shannon-type inequalities, non-Shannon-type inequalities, and other constraints.

Machine Learning in Computer Vision

- Computer Science
- Computational Imaging and Vision
- 2005

This book discusses Bayesian network classifiers and their applications in multi-modal event detection, among other topics.

## References

SHOWING 1-10 OF 12 REFERENCES

The theory of information and coding: A mathematical framework for communication

- Mathematics
- Proceedings of the IEEE
- 1979


Coding Theorems for Discrete Memoryless Systems

- 1981

GALLAGER, Information Theory and Reliable Communication

- 1968

KORNER, Information Theory: Coding Theorems for Discrete Memoryless Systems

- 1981

ASH, Information Theory, Interscience

- New York,
- 1965

SHANNON, A mathematical theory of communication

- Bell Sys. Tech. J.,
- 1948

MCELIECE, The Theory of Information and Coding: A Mathematical Framework for Communication, Addison-Wesley

- 1977