# Unifying the Brascamp-Lieb Inequality and the Entropy Power Inequality

```bibtex
@article{Anantharam2019UnifyingTB,
  title   = {Unifying the Brascamp-Lieb Inequality and the Entropy Power Inequality},
  author  = {Venkat Anantharam and Varun Jog and Chandra Nair},
  journal = {2019 IEEE International Symposium on Information Theory (ISIT)},
  year    = {2019},
  pages   = {1847-1851}
}
```

The entropy power inequality (EPI) and the Brascamp-Lieb inequality (BLI) are fundamental inequalities concerning the differential entropies of linear transformations of random vectors. The EPI provides lower bounds for the differential entropy of linear transformations of random vectors with independent components. The BLI, on the other hand, provides upper bounds on the differential entropy of a random vector in terms of the differential entropies of some of its linear transformations. In…
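In one dimension the EPI reads N(X+Y) >= N(X) + N(Y) for independent X and Y, where N(X) = exp(2h(X))/(2πe) is the entropy power, with equality when X and Y are Gaussian. The following minimal sketch (not from the paper; the function names are illustrative) verifies the Gaussian equality case numerically:

```python
import math

# Differential entropy of a 1-D Gaussian with variance v: h = 0.5 * log(2*pi*e*v)
def gaussian_entropy(v):
    return 0.5 * math.log(2 * math.pi * math.e * v)

# Entropy power N(X) = exp(2*h(X)) / (2*pi*e); for a Gaussian this equals its variance
def entropy_power(h):
    return math.exp(2 * h) / (2 * math.pi * math.e)

vx, vy = 2.0, 3.0                      # variances of independent Gaussians X, Y
lhs = entropy_power(gaussian_entropy(vx + vy))   # N(X+Y); X+Y is Gaussian, variance vx+vy
rhs = entropy_power(gaussian_entropy(vx)) + entropy_power(gaussian_entropy(vy))
assert lhs >= rhs - 1e-12              # EPI: N(X+Y) >= N(X) + N(Y), tight here
print(lhs, rhs)                        # both equal vx + vy = 5.0 up to rounding
```

For non-Gaussian summands the inequality is strict, which is the sense in which the EPI lower-bounds the entropy of a sum.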

## 7 Citations

Smoothing Brascamp-Lieb Inequalities and Strong Converses of Coding Theorems

- Computer Science, Mathematics · IEEE Transactions on Information Theory
- 2020

New single-shot converse bounds for the omniscient helper common randomness generation problem and the Gray-Wyner source coding problem are derived in terms of the smooth BL divergence, where the proof relies on the functional formulation of the Brascamp-Lieb inequality.

Quantum Brascamp-Lieb Dualities

- Mathematics, Physics
- 2019

Brascamp-Lieb inequalities are entropy inequalities which have a dual formulation as generalized Young inequalities. In this work, we introduce a fully quantum version of this duality, relating…

Euclidean Forward–Reverse Brascamp–Lieb Inequalities: Finiteness, Structure, and Extremals

- Mathematics, Computer Science · arXiv
- 2019

The main results concerning finiteness, structure, and Gaussian-extremizability for the Brascamp–Lieb inequality due to Bennett, Carbery, Christ, and Tao are generalized to the setting of the forward–reverse Brascamp–Lieb inequality.

Transportation Proof of an Inequality by Anantharam, Jog and Nair

- Mathematics, Computer Science · arXiv
- 2019

Anantharam, Jog and Nair recently put forth an entropic inequality which simultaneously generalizes the Shannon-Stam entropy power inequality and the Brascamp-Lieb inequality in entropic form. We…

On the structure of certain non-convex functionals and the Gaussian Z-interference channel

- Computer Science, Mathematics · 2020 IEEE International Symposium on Information Theory (ISIT)
- 2020

In this paper, we establish, using information-theoretic methods, that a maximizer of a non-convex problem over positive semidefinite matrices has a certain property. Further, we propose a Gaussian…

An Algebraic and Probabilistic Framework for Network Information Theory

- Computer Science · Found. Trends Commun. Inf. Theory
- 2020

This monograph develops a mathematical framework based on asymptotically good random structured codes, i.e., codes possessing algebraic properties, for network information theory that is applicable to arbitrary instances of the multi-terminal communication problems under consideration.

Extracting Robust and Accurate Features via a Robust Information Bottleneck

- Computer Science, Mathematics · IEEE Journal on Selected Areas in Information Theory
- 2020

This work proposes a novel strategy for extracting features in supervised learning that can be used to construct a classifier more robust to small perturbations in the input space: an additional penalty term encourages the Fisher information of the extracted features, parametrized by the inputs, to be small.

## References

SHOWING 1-10 OF 55 REFERENCES

Subadditivity of The Entropy and its Relation to Brascamp–Lieb Type Inequalities

- Mathematics
- 2007

We prove a general duality result showing that a Brascamp–Lieb type inequality is equivalent to an inequality expressing subadditivity of the entropy, with a complete correspondence of best constants…

Information Theoretic Proofs of Entropy Power Inequalities

- Mathematics, Computer Science · IEEE Transactions on Information Theory
- 2011

A new and brief proof of the EPI is developed through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai, and Verdú used in earlier proofs.

Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities †

- Computer Science, Medicine · Entropy
- 2018

This work employs the technique of linear matrix inequalities to show that, when the probability density function of X+tZ is log-concave, McKean’s conjecture holds for orders up to at least five.

The Brascamp–Lieb Inequalities: Finiteness, Structure and Extremals

- Mathematics
- 2005

Abstract.We consider the Brascamp–Lieb inequalities concerning multilinear integrals of products of functions in several dimensions. We give a complete treatment of the issues of finiteness of the…

A Forward-Reverse Brascamp-Lieb Inequality: Entropic Duality and Gaussian Optimality

- Medicine, Mathematics · Entropy
- 2018

A functional inequality is introduced that unifies the Brascamp–Lieb inequality and Barthe's inequality, which is a reverse form of the Brascamp–Lieb inequality, and its equivalent entropic formulation for Polish spaces is proved.

Generalized Entropy Power Inequalities and Monotonicity Properties of Information

- Mathematics, Computer Science · IEEE Transactions on Information Theory
- 2007

A simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of independent and identically distributed (i.i.d.) summands as well as in the more general setting of independent summands with variance-standardized sums.

Combinatorial Entropy Power Inequalities: A Preliminary Study of the Stam Region

- Mathematics, Computer Science · IEEE Transactions on Information Theory
- 2019

It is shown that the class of fractionally superadditive set functions provides an outer bound to the Stam region, resolving a conjecture of Barron and Madiman.

A Strong Entropy Power Inequality

- Mathematics, Computer Science · IEEE Transactions on Information Theory
- 2018

When one of the random summands is Gaussian, the entropy power inequality (EPI) is sharpened in terms of the strong data processing function for Gaussian channels, which leads to a new reverse EPI and sharpens Stam's uncertainty principle relating entropy power and Fisher information.

Information theoretic inequalities

- Mathematics, Computer Science · IEEE Trans. Inf. Theory
- 1991

The authors focus on the entropy power inequality (including the related Brunn-Minkowski, Young's, and Fisher information inequalities) and address various uncertainty principles and their interrelations.

Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon

- Computer Science, Mathematics · Inf. Control.
- 1959

A certain analogy is found to exist between a special case of Fisher's quantity of information I and the inverse of the "entropy power" of Shannon; this constitutes a sharpening of the uncertainty relation of quantum mechanics for canonically conjugate variables.
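Stam's relation above can be written as N(X) J(X) >= n, where J(X) is the Fisher information and N(X) the entropy power, with equality exactly for Gaussians. A minimal numerical sketch of the one-dimensional equality case (illustrative function names, not from the paper):

```python
import math

# For a 1-D Gaussian with variance v: Fisher information J(X) = 1/v
def fisher_info_gaussian(v):
    return 1.0 / v

# Entropy power N(X) = exp(2*h(X)) / (2*pi*e); for a Gaussian this recovers v
def entropy_power_gaussian(v):
    h = 0.5 * math.log(2 * math.pi * math.e * v)   # differential entropy
    return math.exp(2 * h) / (2 * math.pi * math.e)

v = 4.0
product = entropy_power_gaussian(v) * fisher_info_gaussian(v)
print(product)  # equals n = 1, the equality case of Stam's inequality
```

Any non-Gaussian density makes the product strictly larger than n, which is the sharpening of the uncertainty relation the summary refers to.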