# On the $\Phi$-Stability and Related Conjectures

@inproceedings{Yu2021OnT, title={On the $\Phi$-Stability and Related Conjectures}, author={Lei Yu}, year={2021}}

Let X be a random variable uniformly distributed on the discrete cube {−1, 1}ⁿ, and let Tρ be the noise operator acting on Boolean functions f : {−1, 1}ⁿ → {0, 1}, where ρ ∈ [0, 1] is the noise parameter, representing the correlation coefficient between each coordinate of X and its noise-corrupted version. Given a convex function Φ and the mean Ef(X) = a ∈ [0, 1], which Boolean function f maximizes the Φ-stability E[Φ(Tρf(X))] of f? Special cases of this problem include the (symmetric and…
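As a sanity check on the quantities in the abstract, here is a minimal brute-force sketch (all function names are hypothetical, not from the paper): it computes Tρf(x) = E[f(Y) | X = x] by summing over all 2ⁿ noisy outcomes, then evaluates the Φ-stability E[Φ(Tρf(X))] for the dictator function with the convex choice Φ(t) = t².

```python
import itertools

def noise_operator(f, x, rho):
    """T_rho f(x) = E[f(Y) | X = x], where each coordinate of Y
    independently equals x_i with probability (1+rho)/2 and -x_i otherwise."""
    n = len(x)
    total = 0.0
    for y in itertools.product([-1, 1], repeat=n):
        agree = sum(1 for xi, yi in zip(x, y) if xi == yi)
        p = ((1 + rho) / 2) ** agree * ((1 - rho) / 2) ** (n - agree)
        total += p * f(y)
    return total

def phi_stability(f, n, rho, Phi):
    """E[Phi(T_rho f(X))] with X uniform on {-1, 1}^n."""
    points = list(itertools.product([-1, 1], repeat=n))
    return sum(Phi(noise_operator(f, x, rho)) for x in points) / len(points)

# Example: the dictator function f(x) = 1 if x_1 = 1 else 0, with Phi(t) = t^2.
dictator = lambda x: (1 + x[0]) // 2
val = phi_stability(dictator, n=3, rho=0.5, Phi=lambda t: t * t)
# T_rho f takes the value (1+rho)/2 = 0.75 when x_1 = 1 and 0.25 otherwise,
# so the Phi-stability is (0.75**2 + 0.25**2) / 2 = 0.3125.
```

For the dictator function Tρf depends only on x₁, which is why the closed form above is exact; for general f the brute force costs 2ⁿ evaluations per point and is only meant for small n.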

## References


### On the Entropy of a Noisy Function

- Computer Science · IEEE Transactions on Information Theory
- 2016

If a Boolean function f is close to a characteristic function g of a subcube of dimension n−1, then the entropy of Tϵf is at most that of Tϵg.

### Which Boolean Functions Maximize Mutual Information on Noisy Inputs?

- Computer Science · IEEE Transactions on Information Theory
- 2014

This work poses a simply stated conjecture regarding the maximum mutual information a Boolean function can reveal about noisy inputs and provides substantial evidence supporting its validity.

### Which Boolean functions are most informative?

- Computer Science · 2013 IEEE International Symposium on Information Theory
- 2013

This work introduces a simply stated conjecture regarding the maximum mutual information a Boolean function can reveal about noisy inputs and provides substantial evidence supporting its validity.

### Noise sensitivity of Boolean functions and applications to percolation

- Mathematics
- 1998

It is shown that a large class of events in a product probability space are highly sensitive to noise, in the sense that with high probability, the configuration with an arbitrary small percent of…

### On the Most Informative Boolean Functions of the Very Noisy Channel

- Computer Science · 2019 IEEE International Symposium on Information Theory (ISIT)
- 2019

A calculus-based approach is presented to show a dimension-dependent result by examining the second derivative of H(α) − H(f(Xⁿ) | Yⁿ) at α = 1/2, and it is shown that the dictator function is the most informative function in the high-noise regime.

### An improved upper bound for the most informative boolean function conjecture

- Mathematics, Computer Science · 2016 IEEE International Symposium on Information Theory (ISIT)
- 2016

A new upper bound is derived that holds for all balanced functions and improves upon the best known previous bound for α > 1/3.

### Remarks on the Most Informative Function Conjecture at fixed mean

- Mathematics
- 2015

A continuous version of the conjecture on the sphere is proved, which implies the previously known analogue for Gaussian space; it is also shown that Courtade and Kumar's stronger Lex Conjecture fails for small noise rates.

### Non-interactive correlation distillation, inhomogeneous Markov chains, and the reverse Bonami-Beckner inequality

- Mathematics, Computer Science · Israel Journal of Mathematics
- 2006

NICD, a generalization of noise sensitivity previously considered in [5, 31, 39], is extended to trees, and the reverse Bonami–Beckner inequality is used to prove a new isoperimetric inequality for the discrete cube and a new result on the mixing of short random walks on the cube.

### ON SEQUENCES OF PAIRS OF DEPENDENT RANDOM VARIABLES

- Mathematics
- 1975

The generalized random variables $(x, y)$ have a given joint distribution. Pairs $(x_i, y_i)$ are drawn independently. The observer of $(x_1, \cdots, x_n)$ and the observer of $(y_1,…

### A polynomial bound in Freiman's theorem

- Mathematics
- 2002

Earlier bounds involved exponential dependence in α in the second estimate. Our argument combines I. Ruzsa’s method, which we improve in several places, as well as Y. Bilu’s proof of Freiman’s…