Chang's lemma via Pinsker's inequality

@article{Hambardzumyan2020ChangsLV,
  title={Chang's lemma via Pinsker's inequality},
  author={Lianna Hambardzumyan and Yaqiao Li},
  journal={Discrete Mathematics},
  year={2020},
  volume={343},
  pages={111496}
}
Abstract
Extending the idea in Impagliazzo et al. (2014), we give a short information-theoretic proof of Chang's lemma based on Pinsker's inequality.
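For context, the two standard statements involved (given here in a common textbook form, with constants up to normalization; the paper's entropic argument makes the constant explicit) are:

```latex
% Pinsker's inequality: total variation is controlled by KL divergence
% (KL divergence taken with the natural logarithm).
\[
  \mathrm{TV}(P, Q) \;\le\; \sqrt{\tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\|\, Q)}.
\]

% Chang's lemma: for A \subseteq \mathbb{F}_2^n of density
% \alpha = |A| / 2^n with indicator f = 1_A, the large Fourier spectrum
\[
  \Delta \;=\; \{\gamma \in \widehat{\mathbb{F}_2^n} : |\hat f(\gamma)| \ge \epsilon\alpha\}
\]
% spans a subspace of dimension
\[
  \dim \mathrm{span}(\Delta) \;=\; O\!\left(\epsilon^{-2} \log(1/\alpha)\right).
\]
```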
Comparing computational entropies below majority (or: When is the dense model theorem false?)
TLDR
The dense model theorem plays a key role in Green and Tao's proof that the primes contain arbitrarily long arithmetic progressions and has since been connected to a surprisingly wide range of topics in mathematics and computer science, including cryptography, computational complexity, combinatorics and machine learning.
An Improved Linear Programming Bound on the Average Distance of a Binary Code
TLDR
This paper improves Fu-Wei-Yeung's bound on the minimum average Hamming distance by finding a better feasible solution to their dual program and applies the linear programming technique to prove bounds on Fourier weights of a Boolean function of various degrees.

References

Showing 1–10 of 18 references
An Entropic Proof of Chang's Inequality
TLDR
An elementary proof of Chang's lemma using entropy is given, obtaining a tight constant and a slight improvement in the case where the variables are highly biased.
An information-theoretic proof of a hypercontractive inequality
In this note I give an information-theoretic proof of the Bonami-Beckner-Gross hypercontractive inequality.
Entropy and Counting
We illustrate the role of information-theoretic ideas in combinatorial problems, some of them arising in computer science. We also consider the problem of covering graphs using other graphs, and show …
A polynomial bound in Freiman's theorem
Earlier bounds involved exponential dependence in α in the second estimate. Our argument combines I. Ruzsa's method, which we improve in several places, as well as Y. Bilu's proof of Freiman's …
Three tutorial lectures on entropy and counting
We explain the notion of the entropy of a discrete random variable, and derive some of its basic properties. We then show through examples how entropy can be useful as a combinatorial …
A new proof of the graph removal lemma
  • J. Fox
  • Mathematics, Computer Science
  • ArXiv
  • 2010
TLDR
A new proof is given which avoids Szemerédi's regularity lemma and gives a better bound for the directed and multicolored analogues of the graph removal lemma.
Hypercontractivity Via the Entropy Method
TLDR
It is shown that Shearer's Lemma and elementary arguments about the entropy of random variables are sufficient to recover the optimal hypercontractive inequality for all even integers q.
Some applications of relative entropy in additive combinatorics
This survey looks at some recent applications of relative entropy in additive combinatorics. Specifically, we examine to what extent entropy-increment arguments can replace or even …
On the Erdős Discrepancy Problem
TLDR
It is proved that any completely multiplicative sequence of size 127,646 or more has discrepancy at least 4, proving the Erdős discrepancy conjecture for discrepancy up to 3 and providing inductive construction rules as well as streamlining methods to improve the lower bounds for sequences of higher discrepancies.
Proof of a hypercontractive estimate via entropy
Consider the probability space $W = \{-1, 1\}^n$ with the uniform (= product) measure. Let $f\colon W \to \mathbb{R}$ be a function. Let $f = \sum_I f_I X_I$ be its unique expression as a multilinear polynomial, where $X_I = \prod_{i \in I} x_i$. …
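The truncated abstract above refers to a hypercontractive estimate; for orientation, the standard Bonami-Beckner inequality in this notation (a generic textbook statement, not necessarily the exact form proved in that paper) reads:

```latex
% Noise operator acting on the multilinear expansion f = \sum_I f_I X_I:
\[
  T_\rho f \;=\; \sum_I \rho^{|I|} f_I X_I .
\]
% Hypercontractive inequality: for q \ge 2 and 0 \le \rho \le 1/\sqrt{q-1},
\[
  \| T_\rho f \|_q \;\le\; \| f \|_2 .
\]
```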