Unifying the Brascamp-Lieb Inequality and the Entropy Power Inequality

@article{Anantharam2019UnifyingTB,
  title={Unifying the Brascamp-Lieb Inequality and the Entropy Power Inequality},
  author={Venkat Anantharam and Varun Jog and Chandra Nair},
  journal={2019 IEEE International Symposium on Information Theory (ISIT)},
  year={2019},
  pages={1847-1851}
}
The entropy power inequality (EPI) and the Brascamp-Lieb inequality (BLI) are fundamental inequalities concerning the differential entropies of linear transformations of random vectors. The EPI provides lower bounds on the differential entropy of linear transformations of random vectors with independent components. The BLI, on the other hand, provides upper bounds on the differential entropy of a random vector in terms of the differential entropies of some of its linear transformations. In…
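For concreteness, the two inequalities can be stated in their standard forms from the literature (the notation below is generic, with differential entropy h, independent random vectors X_1, …, X_m in R^n, surjective linear maps B_j : R^n → R^{n_j}, and exponents c_j ≥ 0; it is a sketch of the standard statements, not this paper's exact setup):

\[
e^{2h(X_1 + \cdots + X_m)/n} \;\ge\; \sum_{i=1}^{m} e^{2h(X_i)/n} \qquad \text{(EPI)}
\]
\[
h(X) \;\le\; \sum_{j=1}^{k} c_j\, h(B_j X) \,+\, \log C \qquad \text{(entropic BLI)}
\]

Here C is the best constant in the corresponding functional Brascamp-Lieb inequality. The first display is the lower bound and the second the upper bound described in the abstract; Gaussian vectors are extremal in both cases.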

Citations

Smoothing Brascamp-Lieb Inequalities and Strong Converses of Coding Theorems
New single-shot converse bounds for the omniscient helper common randomness generation problem and the Gray-Wyner source coding problem are derived in terms of the smooth BL divergence, where the proof relies on the functional formulation of the Brascamp-Lieb inequality.
Quantum Brascamp-Lieb Dualities
Brascamp-Lieb inequalities are entropy inequalities which have a dual formulation as generalized Young inequalities. In this work, we introduce a fully quantum version of this duality, relating…
Euclidean Forward–Reverse Brascamp–Lieb Inequalities: Finiteness, Structure, and Extremals
The main results concerning finiteness, structure, and Gaussian extremizability for the Brascamp–Lieb inequality due to Bennett, Carbery, Christ, and Tao are generalized to the setting of the forward–reverse Brascamp–Lieb inequality.
Transportation Proof of an inequality by Anantharam, Jog and Nair
Anantharam, Jog and Nair recently put forth an entropic inequality which simultaneously generalizes the Shannon-Stam entropy power inequality and the Brascamp-Lieb inequality in entropic form. We give a proof of this inequality via optimal transport…
On the structure of certain non-convex functionals and the Gaussian Z-interference channel
In this paper we establish, using information-theoretic methods, that a maximizer of a certain non-convex problem over positive semidefinite matrices has a particular structural property. Further, we propose a Gaussian…
An Algebraic and Probabilistic Framework for Network Information Theory
This monograph develops a mathematical framework for network information theory based on asymptotically good random structured codes, i.e., codes possessing algebraic properties, that is applicable to arbitrary instances of the multi-terminal communication problems under consideration.
Extracting Robust and Accurate Features via a Robust Information Bottleneck
This work proposes a novel strategy for extracting features in supervised learning that can be used to construct a classifier that is more robust to small perturbations in the input space, by introducing an additional penalty term that encourages the Fisher information of the extracted features with respect to the inputs to be small.

References

Showing 1–10 of 55 references.
Subadditivity of The Entropy and its Relation to Brascamp–Lieb Type Inequalities
We prove a general duality result showing that a Brascamp–Lieb type inequality is equivalent to an inequality expressing subadditivity of the entropy, with a complete correspondence of best constants.
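In symbols, the duality can be sketched as follows (generic notation paraphrasing the result, with nonnegative integrable f_j, surjective linear maps B_j : R^n → R^{n_j}, and exponents c_j ≥ 0):

\[
\int_{\mathbb{R}^n} \prod_{j=1}^{k} f_j(B_j x)^{c_j}\, dx \;\le\; C \prod_{j=1}^{k} \Big( \int_{\mathbb{R}^{n_j}} f_j \Big)^{c_j}
\quad\Longleftrightarrow\quad
h(X) \;\le\; \sum_{j=1}^{k} c_j\, h(B_j X) + \log C \ \text{ for all } X,
\]

with the same best constant C in the two formulations, which is the correspondence referred to above.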
Information Theoretic Proofs of Entropy Power Inequalities
  • O. Rioul
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 2011
A new and brief proof of the EPI is developed through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai, and Verdú used in earlier proofs.
Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities
This work employs the technique of linear matrix inequalities to show that, when the probability density function of X + tZ is log-concave, McKean’s conjecture holds for orders up to at least five.
The Brascamp–Lieb Inequalities: Finiteness, Structure and Extremals
We consider the Brascamp–Lieb inequalities concerning multilinear integrals of products of functions in several dimensions. We give a complete treatment of the issues of finiteness of the…
A Forward-Reverse Brascamp-Lieb Inequality: Entropic Duality and Gaussian Optimality
A functional inequality is introduced that unifies both the Brascamp-Lieb inequality and Barthe’s inequality, which is a reverse form of the Brascamp-Lieb inequality, and its equivalent entropic formulation for Polish spaces is proved.
Generalized Entropy Power Inequalities and Monotonicity Properties of Information
A simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of independent and identically distributed (i.i.d.) summands as well as in the more general setting of independent summands with variance-standardized sums.
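In the i.i.d. case, the monotonicity statement in question takes the following standard form (assuming the differential entropies exist; this display is the standard statement from the literature, not quoted from the summary above):

\[
h\!\left( \frac{X_1 + \cdots + X_{n+1}}{\sqrt{n+1}} \right) \;\ge\; h\!\left( \frac{X_1 + \cdots + X_n}{\sqrt{n}} \right),
\]

so the entropy of the standardized sums increases monotonically toward that of the Gaussian limit in the central limit theorem.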
Combinatorial Entropy Power Inequalities: A Preliminary Study of the Stam Region
It is shown that the class of fractionally superadditive set functions provides an outer bound to the Stam region, resolving a conjecture of Barron and Madiman.
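For orientation, a set function v on the subsets of {1, …, m} is called fractionally superadditive when

\[
v(S) \;\ge\; \sum_{T \subseteq S} \beta_T\, v(T)
\qquad \text{whenever } \beta_T \ge 0 \text{ and } \sum_{T \subseteq S,\; T \ni i} \beta_T = 1 \text{ for every } i \in S.
\]

This definition is standard; in the setting summarized above, the relevant set functions are, up to normalization, entropy powers of subset sums of independent random vectors (an assumption about the Stam region's definition, paraphrased rather than quoted).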
A Strong Entropy Power Inequality
  • T. Courtade
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 2018
When one of the random summands is Gaussian, the entropy power inequality (EPI) is sharpened in terms of the strong data processing function for Gaussian channels, which leads to a new reverse EPI and sharpens Stam’s uncertainty principle relating entropy power and Fisher information.
Information theoretic inequalities
The authors focus on the entropy power inequality (including the related Brunn-Minkowski, Young's, and Fisher information inequalities) and address various uncertainty principles and their interrelations.
Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon
  • A. J. Stam
  • Computer Science, Mathematics
  • Inf. Control.
  • 1959
A certain analogy is found to exist between a special case of Fisher's quantity of information I and the inverse of the “entropy power” of Shannon; this constitutes a sharpening of the uncertainty relation of quantum mechanics for canonically conjugate variables.
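The analogy alluded to is usually quoted today as Stam's inequality, stated here in its standard modern form rather than the 1959 notation: for a random vector X in R^n with a smooth density f,

\[
N(X)\, I(X) \;\ge\; n, \qquad
N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}, \qquad
I(X) = \int_{\mathbb{R}^n} \frac{\lVert \nabla f(x) \rVert^2}{f(x)}\, dx,
\]

with equality if and only if X is Gaussian; the Fisher information I(X) is thus bounded below by n/N(X), the inverse of the entropy power (up to the dimension factor).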