Bounds on the Poincaré constant for convolution measures

@article{Courtade2018BoundsOT,
  title={Bounds on the Poincar{\'e} constant for convolution measures},
  author={Thomas A. Courtade},
  journal={arXiv preprint arXiv:1807.00027},
  year={2018}
}
  • T. Courtade
  • Published 29 June 2018
  • Computer Science, Mathematics
  • ArXiv
We establish a Shearer-type inequality for the Poincaré constant, showing that the Poincaré constant corresponding to the convolution of a collection of measures can be nontrivially controlled by the Poincaré constants corresponding to convolutions of subsets of those measures. This implies, for example, that the Poincaré constant is non-increasing along the central limit theorem. We also establish a dimension-free stability estimate for subadditivity of the Poincaré constant on…
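For context, the quantities in the abstract can be sketched as follows: the standard definition of the Poincaré constant, together with the classical subadditivity bound under convolution that the Shearer-type inequality above generalizes.

```latex
% Poincaré constant of a probability measure \mu on \mathbb{R}^n:
% the smallest constant C_P(\mu) such that, for all smooth f,
\operatorname{Var}_\mu(f) \;\le\; C_P(\mu) \int_{\mathbb{R}^n} |\nabla f|^2 \, d\mu .
% Classical subadditivity for independent summands
% (\mu * \nu denotes convolution of the measures):
C_P(\mu * \nu) \;\le\; C_P(\mu) + C_P(\nu).
```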
Stability of the Bakry-Émery theorem on $\mathbb{R}^n$
We prove stability estimates for the Bakry-Émery bound on Poincaré and logarithmic Sobolev constants of uniformly log-concave measures. In particular, we improve the quantitative bound in a result of…
Stability of the Bakry-Émery theorem on $\mathbb{R}^n$
We establish quantitative stability estimates for the Bakry-Émery bound on logarithmic Sobolev and Poincaré constants of uniformly log-concave measures. More specifically, we show that if a…
Dimension-free log-Sobolev inequalities for mixture distributions
We prove that if $(P_x)_{x \in \mathcal{X}}$ is a family of probability measures which satisfy the log-Sobolev inequality and whose pairwise chi-squared divergences are uniformly bounded, and $\mu$ is any mixing…
Rapid Convergence of the Unadjusted Langevin Algorithm: Isoperimetry Suffices
A convergence guarantee in Kullback-Leibler (KL) divergence is proved assuming $\nu$ satisfies a log-Sobolev inequality and the Hessian of $f$ is bounded.
Maximal Correlation and the Rate of Fisher Information Convergence in the Central Limit Theorem
  • O. Johnson
  • Computer Science, Mathematics
  • IEEE Transactions on Information Theory
  • 2020
It is proved that, assuming the eigenvalue of the operator associated with the Hirschfeld–Gebelein–Rényi maximal correlation satisfies a strict inequality, a rate of convergence and a strengthened form of monotonicity hold.
Quantitative Weak Convergence for Discrete Stochastic Processes
In this paper, we prove quantitative convergence in $W_2$ for a family of Langevin-like stochastic processes that includes stochastic gradient descent and related gradient-based algorithms. Under certain…
Stability of the Shannon–Stam inequality via the Föllmer process
The first stability estimate for general log-concave random vectors in the following form is given, based on a new approach which uses an entropy-minimizing process from stochastic control theory.
Maximal correlation and the rate of Fisher information convergence in the Central Limit Theorem
We consider the behaviour of the Fisher information of scaled sums of independent and identically distributed random variables in the Central Limit Theorem regime. We show how this behaviour can be…
Rapid Convergence of the Unadjusted Langevin Algorithm: Log-Sobolev Suffices
We prove a convergence guarantee on the unadjusted Langevin algorithm for sampling assuming only that the target distribution $e^{-f}$ satisfies a log-Sobolev inequality and the Hessian of $f$ is…
Two remarks on generalized entropy power inequalities
A counter-example regarding monotonicity and entropy comparison of weighted sums of independent identically distributed log-concave random variables is constructed, and a complex analogue of a recent dependent entropy power inequality of Hao and Jog is presented.

References

Existence of Stein Kernels under a Spectral Gap, and Discrepancy Bounds
Two general properties enjoyed by the Stein discrepancy are established, holding whenever a Stein kernel exists: the Stein discrepancy is strictly decreasing along the CLT, and it controls the skewness of a random vector.
Logarithmic Sobolev inequalities for mollified compactly supported measures
We show that the convolution of a compactly supported measure on $\mathbb{R}$ with a Gaussian measure satisfies a logarithmic Sobolev inequality (LSI). We use this result to give a new proof of a…
Functional inequalities for Gaussian convolutions of compactly supported measures: explicit bounds and dimension dependence
The aim of this paper is to establish various functional inequalities for the convolution of a compactly supported measure and a standard Gaussian distribution on $\mathbb{R}^d$. We especially focus on getting…
Convergence of the Poincaré constant
The Poincaré constant $R_Y$ of a random variable $Y$ relates the $L^2(Y)$-norm of a function $g$ and its derivative $g'$. Since $R_Y - D(Y)$ is positive, with equality if and only if $Y$ is normal, it can be…
Stein’s method, logarithmic Sobolev and transport inequalities
We develop connections between Stein’s approximation method, logarithmic Sobolev and transport inequalities by introducing a new class of functional inequalities involving the relative entropy, the…
Poincaré inequalities and dimension free concentration of measure
In this paper, we consider Poincaré inequalities for non-Euclidean metrics on $\mathbb{R}^d$. These inequalities enable us to derive precise dimension-free concentration inequalities for product…
Diffusion on compact Riemannian manifolds and logarithmic Sobolev inequalities
In several earlier papers [5-7], we have explored the connections between the spectrum of Schrödinger operators and the logarithmic Sobolev inequality of Gross [3], showing in the process the…
Poincaré’s inequalities and Talagrand’s concentration phenomenon for the exponential distribution
Summary. We present a simple proof, based on modified logarithmic Sobolev inequalities, of Talagrand’s concentration inequality for the exponential distribution. We actually observe that every…
On fine properties of mixtures with respect to concentration of measure and Sobolev type inequalities
Mixtures are convex combinations of laws. Despite this simple definition, a mixture can be far more subtle than its mixed components. For instance, mixing Gaussian laws may produce a potential with…
From dimension free concentration to the Poincaré inequality
We prove that a probability measure on an abstract metric space satisfies a non-trivial dimension-free concentration inequality for the $\ell_2$ metric if and only if it satisfies the Poincaré…