# A Composition Theorem for the Fourier Entropy-Influence Conjecture

@article{ODonnell2013ACT, title={A Composition Theorem for the Fourier Entropy-Influence Conjecture}, author={Ryan O'Donnell and Li-Yang Tan}, journal={ArXiv}, year={2013}, volume={abs/1304.1347} }

The Fourier Entropy-Influence (FEI) conjecture of Friedgut and Kalai [1] seeks to relate two fundamental measures of Boolean function complexity: it states that H[f] ≤ C·Inf[f] holds for every Boolean function f, where H[f] denotes the spectral entropy of f, Inf[f] is its total influence, and C > 0 is a universal constant. Despite significant interest in the conjecture, it has only been shown to hold for a few classes of Boolean functions.
Our main result is a composition theorem for the FEI…
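Both sides of the conjectured inequality can be computed exactly by brute force for small $n$. The following sketch (an illustration under the standard $\pm 1$ Fourier conventions, not code from the paper) evaluates the spectral entropy $H[f]$ and total influence $\mathrm{Inf}[f]$ for the 3-bit majority function:

```python
import math

def fourier_coefficients(f, n):
    """Walsh-Hadamard coefficients f_hat(S) of f: {0,1}^n -> {-1,+1},
    with subsets S of [n] represented as bitmasks."""
    N = 2 ** n
    coeffs = {}
    for S in range(N):
        total = 0.0
        for x in range(N):
            # chi_S(x) = (-1)^{|S intersect x|}
            sign = -1 if bin(S & x).count("1") % 2 else 1
            total += f(x) * sign
        coeffs[S] = total / N
    return coeffs

def spectral_entropy(coeffs):
    """H[f] = sum_S f_hat(S)^2 * log2(1 / f_hat(S)^2)."""
    return sum(w * math.log2(1.0 / w)
               for w in (c * c for c in coeffs.values()) if w > 0)

def total_influence(coeffs):
    """Inf[f] = sum_S |S| * f_hat(S)^2."""
    return sum(bin(S).count("1") * c * c for S, c in coeffs.items())

# Majority on 3 bits, with outputs in {-1, +1}.
def maj3(x):
    return 1 if bin(x).count("1") >= 2 else -1

coeffs = fourier_coefficients(maj3, 3)
H = spectral_entropy(coeffs)      # = 2.0
Inf = total_influence(coeffs)     # = 1.5
```

Here the spectral weight $\widehat{f}(S)^2$ of $\mathrm{MAJ}_3$ is $1/4$ on each of the three singletons and on $\{1,2,3\}$, so $H[f] = 4 \cdot \tfrac{1}{4}\log_2 4 = 2$ and $\mathrm{Inf}[f] = 3\cdot\tfrac{1}{4} + 3\cdot\tfrac{1}{4} = 3/2$, consistent with the conjectured bound for any $C \geq 4/3$.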

## 15 Citations

### Fourier Entropy-Influence Conjecture for Random Linear Threshold Functions

- Mathematics, Computer Science, LATIN
- 2018

This paper considers two natural distributions on the weights defining a linear threshold function, namely the uniform distribution on $[-1,1]$ and the normal distribution, and concludes that the FEI conjecture holds for a “random” linear threshold function.

### Improved Bounds on Fourier Entropy and Min-entropy

- Mathematics, Electron. Colloquium Comput. Complex.
- 2018

A conjecture is posed: no flat polynomial (whose non-zero Fourier coefficients have the same magnitude) of degree $d$ and sparsity $2^{\omega(d)}$ can $1/3$-approximate a Boolean function.

### On the Fourier Entropy Influence Conjecture for Extremal Classes

- Mathematics, Computer Science, ArXiv
- 2018

This work proves the conjecture for extremal cases, namely functions with small influence and functions with high entropy, suggests a direction for proving FEI for read-$k$ DNFs, and proves the Fourier Min-Entropy/Influence (FMEI) Conjecture for regular read-$k$ DNFs.

### Towards a Proof of the Fourier–Entropy Conjecture?

- Mathematics, 2020 IEEE 61st Annual Symposium on Foundations of Computer Science (FOCS)
- 2020

A new concentration result on the Fourier spectrum of Boolean functions with small total influence implies that the class of functions whose total influence is at most $K$ is agnostically learnable in time $2^{O(K\log K)}$, using membership queries.

### Decision trees, protocols and the entropy-influence conjecture

- Mathematics, ITCS
- 2014

A natural interpretation of the FEI conjecture is studied: there exists a communication protocol which, given a subset $S$ of $[n]$ distributed as $\widehat{f}^2$, can communicate the value of $S$ using at most $C \cdot \operatorname{Inf}[f]$ bits in expectation.

### Improved Lower Bounds for the Fourier Entropy/Influence Conjecture via Lexicographic Functions

- Mathematics, ArXiv
- 2017

A Lipschitz-type condition on the total influence and spectral entropy is proved, which may be of independent interest, and three explicit asymptotic constructions are presented that improve upon the previously best known lower bound on the value of C.

### Cryptographic Boolean functions with biased inputs

- Computer Science, Mathematics, Cryptography and Communications
- 2015

If $p$ is allowed to take complex values, then a framework involving quantum Boolean functions can be introduced, which provides a connection between the Walsh-Hadamard transform, the nega-Hadamard transform, and Boolean functions with biased inputs.

### Decision Trees, Protocols, and the Fourier Entropy-Influence Conjecture

- Mathematics, ArXiv
- 2013

A natural interpretation of the Fourier Entropy-Influence conjecture is presented: there exists a communication protocol which, given a subset $S$ of $[n]$ distributed as $\widehat{f}^2$, can communicate the value of $S$ using at most $C\cdot\operatorname{Inf}[f]$ bits in expectation.

### A Composition Theorem for Parity Kill Number

- Computer Science, 2014 IEEE 29th Conference on Computational Complexity (CCC)
- 2014

It is proved that if $f$ is not a parity function, then $C_{\min}^{\oplus}[f^{\circ k}] \geq \Omega(C_{\min}[f]^k)$; that is, the parity kill number of $f$ is essentially supermultiplicative in the normal kill number of $f$ (also known as the minimum certificate complexity).

## References


### The Fourier Entropy-Influence Conjecture for Certain Classes of Boolean Functions

- Mathematics, ICALP
- 2011

This work verified the Fourier Entropy-Influence Conjecture for symmetric functions and, more generally, for functions with symmetry group $S_{n_1} \times \cdots \times S_{n_d}$ where $d$ is constant, and also verified the conjecture for functions computable by read-once decision trees.

### Every monotone graph property has a sharp threshold

- Mathematics
- 1996

In their seminal work which initiated random graph theory, Erdős and Rényi discovered that many graph properties have sharp thresholds as the number of vertices tends to infinity. We prove a…

### The influence of variables in product spaces

- Mathematics
- 1992

Let $X$ be a probability space and let $f: X^n \to \{0,1\}$ be a measurable map. Define the influence of the $k$-th variable on $f$, denoted by $I_f(k)$, as follows: for $u=(u_1,u_2,\ldots,u_{n-1}) \in X^{n-1}$ consider the…

### Mansour's Conjecture is True for Random DNF Formulas

- Mathematics, COLT
- 2010

This work makes the first progress on the conjecture that for every DNF formula on $n$ variables with $t$ terms there exists a polynomial $p$ with $t^{O(\log(1/\epsilon))}$ non-zero coefficients such that $\mathbf{E}_{x \in \{0,1\}^n}[(p(x) - f(x))^2] \leq \epsilon$.

### Learning monotone decision trees in polynomial time

- Computer Science, Mathematics, 21st Annual IEEE Conference on Computational Complexity (CCC'06)
- 2006

This is the first algorithm that can learn arbitrary monotone Boolean functions to high accuracy, using random examples only, in time polynomial in a reasonable measure of the complexity of f.

### Learning Boolean Functions via the Fourier Transform

- Computer Science
- 1994

The Fourier transform representation of functions with Boolean inputs has been far less studied, but it seems that it can be used to learn many classes of Boolean functions.

### On learning monotone DNF under product distributions

- Mathematics, Computer Science, Inf. Comput.
- 2001

We show that the class of monotone $2^{O(\sqrt{n})}$-term DNF formulae can be PAC learned in polynomial time under the uniform distribution from random examples only. This is an exponential improvement over…

### Improved Pseudorandom Generators for Depth 2 Circuits

- Computer Science, Mathematics, APPROX-RANDOM
- 2009

It is shown that seed length $O(\log mn \cdot \log 1/\delta)$ suffices, which is an improvement for large $\delta$, and it is proved that a $1/m^{O(\log mn)}$-biased distribution $1/\mathrm{poly}(nm)$-fools DNFs with $m$ terms and $n$ variables, from which the existence of a $\mathrm{poly}(n,m)$-time computable pseudorandom generator follows.

### Boolean Functions With Low Average Sensitivity Depend On Few Coordinates

- Mathematics, Comb.
- 1998

It is shown here that if the average sensitivity of $f$ is $k$, then $f$ can be approximated by a function depending on $c^k$ coordinates, where $c$ is a constant depending only on the accuracy of the approximation but not on $n$.