# Concentration of Measure Inequalities in Information Theory, Communications, and Coding

```bibtex
@article{Raginsky2013ConcentrationOM,
  title   = {Concentration of Measure Inequalities in Information Theory, Communications, and Coding},
  author  = {Maxim Raginsky and Igal Sason},
  journal = {Found. Trends Commun. Inf. Theory},
  year    = {2013},
  volume  = {10},
  pages   = {1-246}
}
```

Concentration inequalities have been the subject of exciting developments during the last two decades, and have been intensively studied and used as a powerful tool in various areas. These include convex geometry, functional analysis, statistical physics, mathematical statistics, pure and applied probability theory, information theory, theoretical computer science, learning theory, and dynamical systems.

## 207 Citations

### On Strong Data-Processing and Majorization Inequalities with Applications to Coding Problems

- Computer Science, ArXiv
- 2021

This work provides data-processing and majorization inequalities for f-divergences and considers some of their applications to coding problems; in particular, non-asymptotic bounds are derived for lossless data compression of discrete memoryless sources.

### Information Theory from A Functional Viewpoint

- Computer Science
- 2018

This thesis proposes a new methodology of deriving converse bounds based on convex duality and the reverse hypercontractivity of Markov semigroups and uses the functional inequality for the so-called Eγ metric to prove non-asymptotic achievability (i.e. existence) bounds for several problems including source coding, wiretap channels and mutual covering.

### Concentration Inequalities

- Computer Science
- 2018

Concentration inequalities are statements about the probability that some random variable deviates from its mean that are used as proof techniques to establish fairly fundamental results, or as a tool to analyze the performance of randomized algorithms, such as in statistical learning.
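
As a quick, hypothetical illustration of such a use (not taken from the cited work), the sketch below compares the classical two-sided Hoeffding bound with the empirical deviation frequency for the mean of bounded i.i.d. samples; the function names are invented for this example:

```python
import math
import random

def hoeffding_bound(n, t):
    # Two-sided Hoeffding bound for the empirical mean of n i.i.d.
    # variables taking values in [0, 1]:
    #   P(|mean - E[mean]| >= t) <= 2 * exp(-2 * n * t**2)
    return 2 * math.exp(-2 * n * t * t)

def empirical_deviation(n, t, trials=20000, seed=0):
    # Empirical frequency of the same deviation event for
    # Uniform[0, 1] samples (true mean 0.5), estimated by simulation.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() for _ in range(n)) / n
        if abs(mean - 0.5) >= t:
            hits += 1
    return hits / trials

n, t = 100, 0.1
print(empirical_deviation(n, t), "<=", hoeffding_bound(n, t))
```

For uniform samples the empirical deviation frequency falls well below the bound, which is the expected behavior: Hoeffding's inequality holds for every distribution supported on [0, 1] and is therefore loose for any particular one.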

### Generalizations of Fano’s Inequality for Conditional Information Measures via Majorization Theory †

- Mathematics, Entropy
- 2020

The key step in the derivation is the construction of an appropriate conditional distribution inducing a desired marginal distribution on a countably infinite alphabet, based on the infinite-dimensional version of Birkhoff’s theorem proven by Révész.

### Strong Data Processing Inequalities and $\Phi $ -Sobolev Inequalities for Discrete Channels

- Mathematics, IEEE Transactions on Information Theory
- 2016

This paper presents a systematic study of the optimal constants in SDPIs for discrete channels, including their variational characterizations, upper and lower bounds, structural results for channels on product probability spaces, and the relationship between the SDPIs and the so-called Φ-Sobolev inequalities.
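
For orientation, the SDPI constant of a channel $W$ under KL divergence is commonly defined as the contraction coefficient

```latex
\eta_{\mathrm{KL}}(W) \;=\; \sup_{\substack{P \neq Q \\ 0 < D(P \,\|\, Q) < \infty}}
  \frac{D(PW \,\|\, QW)}{D(P \,\|\, Q)} \;\le\; 1,
```

so that a strong data-processing inequality sharpens the ordinary one to $D(PW \,\|\, QW) \le \eta_{\mathrm{KL}}(W)\, D(P \,\|\, Q)$; this is the standard definition, stated here for context rather than quoted from the cited paper.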

### Some Useful Integral Representations for Information-Theoretic Analyses

- Mathematics, Entropy
- 2020

When applied to the calculation of a moment of the sum of a large number n of nonnegative random variables, integration over one or two dimensions is significantly easier than the n-dimensional integration required by a direct calculation of the desired moment.
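
A representative identity of this type (a standard integral representation, stated here for orientation): for $\rho \in (0,1)$ and $x \ge 0$,

```latex
x^{\rho} \;=\; \frac{\rho}{\Gamma(1-\rho)} \int_0^{\infty}
  \frac{1 - e^{-u x}}{u^{1+\rho}} \,\mathrm{d}u,
```

so for $S_n = \sum_{i=1}^n X_i$ with independent nonnegative $X_i$, the fractional moment $\mathbb{E}[S_n^{\rho}]$ reduces to a one-dimensional integral of $1 - \prod_{i=1}^n \mathbb{E}[e^{-u X_i}]$, since the expectation of the product of the Laplace-transform terms factorizes under independence.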

### Information in probability: Another information-theoretic proof of a finite de Finetti theorem

- Computer Science, ArXiv
- 2022

An upper bound is derived on the relative entropy between the distribution of finitely many variables in a sequence of exchangeable random variables and an appropriate mixture over product distributions, recovering de Finetti’s classical representation theorem as a corollary.

### Generalizations of Talagrand Inequality for Sinkhorn Distance Using Entropy Power Inequality

- Computer Science, Entropy
- 2022

It is shown that the quadratic cost in entropic optimal transport can be upper-bounded using entropy power inequality (EPI)-type bounds, and that the geometry induced by the Sinkhorn distance is smoothed in the sense of measure concentration.

### New-Type Hoeffding's Inequalities and Application in Tail Bounds

- Mathematics, ArXiv
- 2021

A new type of Hoeffding inequality is presented in which higher-order moments of the random variables are taken into account, yielding considerable improvements in tail-bound evaluation over the known results.

### Matrix Poincaré, Φ-Sobolev inequalities, and quantum ensembles

- Mathematics, Journal of Mathematical Physics
- 2019

Sobolev-type inequalities have been extensively studied in the frameworks of real-valued functions and non-commutative Lp spaces, and have proven useful in bounding the time evolution of…

## References

SHOWING 1-10 OF 216 REFERENCES

### Concentration Inequalities - A Nonasymptotic Theory of Independence

- Mathematics, Concentration Inequalities
- 2013

Deep connections with isoperimetric problems are revealed whilst special attention is paid to applications to the supremum of empirical processes.

### On Measures of Entropy and Information

- Computer Science
- 2015

Outline: Euclidean metrics; Manhattan distance; Minkowski distance; symmetric Csiszár f-divergences; total variation distance.

### Information theoretic inequalities

- Mathematics, IEEE Trans. Inf. Theory
- 1991

The authors focus on the entropy power inequality (including the related Brunn-Minkowski, Young's, and Fisher information inequalities) and address various uncertainty principles and their interrelations.
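
For reference, the entropy power inequality in question states that for independent continuous random vectors $X$ and $Y$ in $\mathbb{R}^n$,

```latex
N(X+Y) \;\ge\; N(X) + N(Y),
\qquad
N(X) := \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)},
```

where $h(\cdot)$ denotes differential entropy, with equality when $X$ and $Y$ are Gaussian with proportional covariance matrices.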

### Information Theory - Coding Theorems for Discrete Memoryless Systems, Second Edition

- Computer Science
- 2011

This new edition presents unique discussions of information theoretic secrecy and of zero-error information theory, including the deep connections of the latter with extremal combinatorics.

### Information Theory and Statistics: A Tutorial

- Computer Science, Mathematics, Found. Trends Commun. Inf. Theory
- 2004

This tutorial is concerned with applications of information-theoretic concepts in statistics, in the finite-alphabet setting; an introduction is provided to the theory of universal coding and to statistical inference via the minimum description length principle motivated by that theory.

### Variations on the Gallager bounds, connections, and applications

- Computer Science, IEEE Trans. Inf. Theory
- 2002

This work discusses many reported upper bounds on the maximum-likelihood (ML) decoding error probability, demonstrates the underlying connections that exist between them, and addresses the Gallager bounds and their variations.

### Probability theory and combinatorial optimization

- Mathematics
- 1987

Preface. 1. First View of Problems and Methods: a first example; long common subsequences; subadditivity and expected values; Azuma’s inequality and a first application; a second example. The…

### Concentration inequalities with exchangeable pairs (Ph.D. thesis)

- Mathematics
- 2005

The purpose of this dissertation is to introduce a version of Stein's method of exchangeable pairs to solve problems in measure concentration. We specifically target systems of dependent random…

### Properties of isoperimetric, functional and Transport-Entropy inequalities via concentration

- Mathematics
- 2009

Various properties of isoperimetric, functional, Transport-Entropy and concentration inequalities are studied on a Riemannian manifold equipped with a measure, whose generalized Ricci curvature is…

### A measure concentration inequality for contracting Markov chains

- Mathematics
- 1996

The concentration of measure phenomenon in product spaces means the following: if a subset A of the n'th power of a probability space Χ does not have too small a probability, then very large probability…