Concentration of Measure Inequalities in Information Theory, Communications, and Coding

@article{Raginsky2013ConcentrationOM,
  title={Concentration of Measure Inequalities in Information Theory, Communications, and Coding},
  author={Maxim Raginsky and Igal Sason},
  journal={Found. Trends Commun. Inf. Theory},
  year={2013},
  volume={10},
  pages={1-246}
}
  • Published 19 December 2012
  • Computer Science
Concentration inequalities have been the subject of exciting developments during the last two decades, and have been intensively studied and used as a powerful tool in various areas. These include convex geometry, functional analysis, statistical physics, mathematical statistics, pure and applied probability theory, information theory, theoretical computer science, learning theory, and dynamical systems. Concentration of Measure Inequalities in Information Theory, Communications, and Coding… 

Citations

On Strong Data-Processing and Majorization Inequalities with Applications to Coding Problems

TLDR
This work provides data-processing and majorization inequalities for f-divergences and considers some of their applications to coding problems; in particular, non-asymptotic bounds are derived for lossless data compression of discrete memoryless sources.
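
For reference (standard definitions, not specific to this entry): for a convex f with f(1) = 0, the f-divergence between distributions P and Q and its data-processing property under a channel K read

  \[ D_f(P \,\|\, Q) = \sum_x Q(x)\, f\!\left( \frac{P(x)}{Q(x)} \right), \qquad D_f(PK \,\|\, QK) \le D_f(P \,\|\, Q). \]

The strong and majorization-flavored versions studied above sharpen or reorder such bounds.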

Information Theory from A Functional Viewpoint

TLDR
This thesis proposes a new methodology for deriving converse bounds based on convex duality and the reverse hypercontractivity of Markov semigroups, and uses a functional inequality for the so-called Eγ metric to prove non-asymptotic achievability (i.e., existence) bounds for several problems, including source coding, wiretap channels, and mutual covering.
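
For orientation, the Eγ metric is standard in the non-asymptotic literature: for γ ≥ 1,

  \[ E_\gamma(P \,\|\, Q) = \max_{A} \big( P(A) - \gamma\, Q(A) \big), \]

which recovers the total variation distance at γ = 1.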

Concentration Inequalities

TLDR
Concentration inequalities are statements about the probability that some random variable deviates from its mean; they are used as proof techniques to establish fairly fundamental results, or as tools to analyze the performance of randomized algorithms, for example in statistical learning.
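
As a toy illustration of such a deviation statement (a minimal sketch in Python, not taken from any of the papers above; the parameters n, t, and trials are arbitrary choices), one can compare Hoeffding's bound with an empirical deviation frequency:

  # Compare Hoeffding's tail bound with the empirical frequency of
  # large deviations for a sum of n i.i.d. Bernoulli(1/2) variables.
  import math
  import random

  n, t, trials = 1000, 50.0, 20000
  random.seed(0)

  # Empirical estimate of P(S_n - E[S_n] >= t).
  exceed = 0
  for _ in range(trials):
      s = sum(random.randint(0, 1) for _ in range(n))
      if s - n * 0.5 >= t:
          exceed += 1

  # Hoeffding: P(S_n - E[S_n] >= t) <= exp(-2 t^2 / n) for [0, 1]-valued terms.
  print(exceed / trials, math.exp(-2 * t * t / n))

The empirical frequency comes out well below the bound, as expected for a worst-case inequality.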

Generalizations of Fano’s Inequality for Conditional Information Measures via Majorization Theory

TLDR
The key step of the derivation is the construction of an appropriate conditional distribution inducing a desired marginal distribution on a countably infinite alphabet, based on the infinite-dimensional version of Birkhoff’s theorem proven by Révész.
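
The classical inequality being generalized: if X takes values in a finite alphabet and is estimated from Y with error probability P_e, then

  \[ H(X \mid Y) \le h_b(P_e) + P_e \log\big( |\mathcal{X}| - 1 \big), \]

where h_b is the binary entropy function; the entry above extends such bounds to conditional information measures over countably infinite alphabets.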

Strong Data Processing Inequalities and Φ-Sobolev Inequalities for Discrete Channels

  • M. Raginsky
  • Mathematics
    IEEE Transactions on Information Theory
  • 2016
TLDR
This paper presents a systematic study of optimal constants in the SDPIs for discrete channels, including their variational characterizations, upper and lower bounds, structural results for channels on product probability spaces, and the relationship between the SDPIs and the so-called Φ-Sobolev inequalities.
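
In the notation standard for this line of work, the SDPI constant of a channel K with respect to the KL divergence is the best contraction factor

  \[ \eta(K) = \sup_{P \neq Q} \frac{D(PK \,\|\, QK)}{D(P \,\|\, Q)} \le 1, \]

and strict inequality η(K) < 1 quantifies the irreversible information loss through K.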

Some Useful Integral Representations for Information-Theoretic Analyses

TLDR
When these representations are applied to the calculation of a moment of the sum of a large number, n, of nonnegative random variables, integration over one or two dimensions is significantly easier than the alternative of integrating over n dimensions, as needed in the direct calculation of the desired moment.
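
One representation of this flavor (quoted from the general literature; the paper's exact forms may differ): for a nonnegative random variable X and ρ ∈ (0, 1),

  \[ \mathbb{E}[X^{\rho}] = \frac{\rho}{\Gamma(1-\rho)} \int_0^{\infty} \frac{1 - \mathbb{E}\big[e^{-uX}\big]}{u^{1+\rho}} \, du, \]

so for X = X_1 + … + X_n with independent nonnegative terms the expectation inside the integral factorizes, and an n-dimensional computation collapses to a single integral.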

Information in probability: Another information-theoretic proof of a finite de Finetti theorem

TLDR
An upper bound is derived on the relative entropy between the distribution of the first k random variables in an exchangeable sequence and an appropriate mixture over product distributions, yielding a finite version of de Finetti’s classical representation theorem as a corollary.

Generalizations of Talagrand Inequality for Sinkhorn Distance Using Entropy Power Inequality

TLDR
It is shown that the quadratic cost in entropic optimal transport can be upper-bounded using entropy power inequality (EPI)-type bounds, and that the geometry induced by the Sinkhorn distance is smoothed in the sense of measure concentration.
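
For context, the classical inequality being generalized is Talagrand's transportation-cost inequality for the standard Gaussian measure γ on R^n:

  \[ W_2^2(\nu, \gamma) \le 2\, D(\nu \,\|\, \gamma) \quad \text{for all probability measures } \nu, \]

where W_2 is the quadratic Wasserstein distance; the entry above replaces W_2 with its entropy-regularized (Sinkhorn) counterpart.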

New-Type Hoeffding's Inequalities and Application in Tail Bounds

TLDR
A new type of Hoeffding inequality is presented, in which higher-order moments of the random variables are taken into account; this yields considerable improvements in the evaluation of tail bounds compared with the known results.
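
The classical baseline such refinements improve on: for independent X_i ∈ [a_i, b_i] and S_n = X_1 + … + X_n,

  \[ \Pr\big[ S_n - \mathbb{E} S_n \ge t \big] \le \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right), \]

which uses only the ranges of the summands; bringing in higher-order moments is what tightens the bound.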

Matrix Poincaré, Φ-Sobolev inequalities, and quantum ensembles

Sobolev-type inequalities have been extensively studied in the frameworks of real-valued functions and non-commutative Lp spaces, and have proven useful in bounding the time evolution of
...

References

Showing 1-10 of 216 references

Concentration Inequalities - A Nonasymptotic Theory of Independence

TLDR
Deep connections with isoperimetric problems are revealed, while special attention is paid to applications to suprema of empirical processes.

On Measures of Entropy and Information

TLDR
Covers Euclidean, Manhattan, and Minkowski distances, symmetric Csiszár f-divergences, and the total variation distance.

Information theoretic inequalities

TLDR
The authors focus on the entropy power inequality (including the related Brunn-Minkowski, Young's, and Fisher information inequalities) and address various uncertainty principles and their interrelations.
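
In standard form, the entropy power inequality states that for independent R^n-valued random vectors X and Y with densities,

  \[ N(X + Y) \ge N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{2 h(X)/n}, \]

with h the differential entropy, and equality iff X and Y are Gaussian with proportional covariance matrices.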

Information Theory - Coding Theorems for Discrete Memoryless Systems, Second Edition

TLDR
This new edition presents unique discussions of information theoretic secrecy and of zero-error information theory, including the deep connections of the latter with extremal combinatorics.

Information Theory and Statistics: A Tutorial

TLDR
This tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting, and an introduction is provided to the theory of universal coding, and to statistical inference via the minimum description length principle motivated by that theory.

Variations on the Gallager bounds, connections, and applications

TLDR
This work discusses many reported upper bounds on the maximum-likelihood (ML) decoding error probability, demonstrates the underlying connections between them, and addresses the Gallager bounds and their variations.

Probability theory and combinatorial optimization

Contents excerpt: Preface; 1. First View of Problems and Methods: a first example; long common subsequences; subadditivity and expected values; Azuma's inequality and a first application; a second example; …
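
Azuma's inequality, flagged in the contents above, is the basic martingale concentration bound: for a martingale (X_k) with bounded differences |X_k - X_{k-1}| ≤ c_k,

  \[ \Pr\big[ X_n - X_0 \ge t \big] \le \exp\!\left( - \frac{t^2}{2 \sum_{k=1}^{n} c_k^2} \right). \]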

Concentration inequalities with exchangeable pairs (Ph.D. thesis)

The purpose of this dissertation is to introduce a version of Stein's method of exchangeable pairs to solve problems in measure concentration. We specifically target systems of dependent random variables …

Properties of isoperimetric, functional and Transport-Entropy inequalities via concentration

Various properties of isoperimetric, functional, Transport-Entropy and concentration inequalities are studied on a Riemannian manifold equipped with a measure, whose generalized Ricci curvature is bounded from below.

A measure concentration inequality for contracting Markov chains

The concentration of measure phenomenon in product spaces means the following: if a subset A of the n'th power of a probability space X does not have too small a probability, then very large probability is concentrated in a small neighborhood of A.
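
One standard quantitative form for product measures (Marton's paper extends such statements to contracting Markov chains): writing A_t for the t-blowup of A in Hamming distance,

  \[ P(A_t) \ge 1 - \exp\!\left( - \frac{2}{n} \Big( t - \sqrt{ \tfrac{n}{2} \ln \tfrac{1}{P(A)} } \Big)^{2} \right), \qquad t \ge \sqrt{ \tfrac{n}{2} \ln \tfrac{1}{P(A)} }. \]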
...