Optimal inapproximability results for MAX-CUT and other 2-variable CSPs?
- Guy Kindler, R. O'Donnell, Subhash Khot, Elchanan Mossel
- Computer Science, Physics · 45th Annual IEEE Symposium on Foundations of…
- 17 October 2004
Though the Majority Is Stablest conjecture is left unproven, partial results are enough to imply that MAX-CUT is hard to (3/4 + 1/(2π) + ε)-approximate (≈ 0.909155), assuming only the Unique Games Conjecture.
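For reference, the constant in that approximation ratio can be checked in a couple of lines of Python (a reader aid, not material from the paper):

```python
# Reader aid: the inapproximability ratio quoted above, 3/4 + 1/(2*pi).
import math

bound = 3/4 + 1/(2*math.pi)
print(f"3/4 + 1/(2*pi) = {bound:.6f}")  # prints ~0.909155
```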
Noise stability of functions with low influences: Invariance and optimality
- Elchanan Mossel, R. O'Donnell, K. Oleszkiewicz
- Mathematics, Computer Science · IEEE Annual Symposium on Foundations of Computer…
- 23 March 2005
An invariance principle for multilinear polynomials with low influences and bounded degree is proved; it shows that under mild conditions the distribution of such polynomials is essentially invariant across all product spaces.
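As a concrete reminder of the quantity the "low influences" hypothesis refers to, here is a brute-force computation of coordinate influences for a small Boolean function (an illustrative sketch of the standard definition, not code from the paper):

```python
# Illustrative sketch: Inf_i(f) = Pr_x[f(x) != f(x with coordinate i flipped)],
# computed by brute force over {-1,1}^n for a small n.
from itertools import product

def influence(f, n, i):
    count = 0
    for x in product([-1, 1], repeat=n):
        y = list(x)
        y[i] = -y[i]                       # flip coordinate i
        if f(x) != f(tuple(y)):
            count += 1
    return count / 2**n

maj3 = lambda x: 1 if sum(x) > 0 else -1   # Majority on 3 bits
print([influence(maj3, 3, i) for i in range(3)])  # [0.5, 0.5, 0.5]
```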
Analysis of Boolean Functions
- R. O'Donnell
- Computer Science · ArXiv
- 5 June 2014
This text gives a thorough overview of Boolean functions, beginning with the most basic definitions and proceeding to advanced topics such as hypercontractivity and isoperimetry, and includes "highlight applications" such as Arrow's theorem from economics.
Learning juntas
- Elchanan Mossel, R. O'Donnell, R. Servedio
- Computer Science · Symposium on the Theory of Computing
- 9 June 2003
The algorithm and analysis exploit new structural properties of Boolean functions and obtain the first polynomial factor improvement on the naive n^k time bound which can be achieved via exhaustive search.
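For contrast, here is a minimal sketch of that naive n^k-time exhaustive-search baseline; the sample format and helper names below are assumptions of mine, not the paper's:

```python
# Naive baseline for learning a k-junta: try every size-k subset of the n
# coordinates and check whether the labels are consistent with some function
# of just those coordinates -- roughly n^k candidate subsets.
from itertools import combinations

def naive_junta_search(samples, n, k):
    """samples: list of (x, y) pairs with x a length-n tuple of bits."""
    for S in combinations(range(n), k):          # ~ n^k subsets
        table = {}
        consistent = True
        for x, y in samples:
            key = tuple(x[i] for i in S)
            if table.setdefault(key, y) != y:    # same restriction, two labels
                consistent = False
                break
        if consistent:
            return S                             # candidate relevant variables
    return None
```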
Every decision tree has an influential variable
- R. O'Donnell, M. Saks, O. Schramm, R. Servedio
- Computer Science · IEEE Annual Symposium on Foundations of Computer…
- 15 August 2005
A very easy proof that the randomized query complexity of nontrivial monotone graph properties is at least Ω(v^{4/3}/p^{1/3}), where v is the number of vertices and p ≤ 1/2 is the critical threshold probability.
Optimal Lower Bounds for Locality-Sensitive Hashing (Except When q is Tiny)
- R. O'Donnell, Yi Wu, Yuan Zhou
- Computer Science · TOCT
- 1 December 2009
It is shown that the "optimal" lower bound for Locality-Sensitive Hashing (LSH) must be at least 1/c (minus o_d(1)), following almost immediately from the observation that the noise stability of a Boolean function at time t is a log-convex function of t.
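A toy numerical check of that log-convexity observation, for the 3-bit majority function and assuming the standard convention that "time t" means correlation ρ = e^{-t} (my own example, not from the paper):

```python
# Maj3 has Fourier expansion (x1+x2+x3)/2 - x1*x2*x3/2, so its noise stability
# at correlation rho is (3*rho + rho^3)/4.  With rho = exp(-t), log Stab(t)
# should be convex in t; we test this via discrete second differences.
import math

def stab_maj3(t):
    rho = math.exp(-t)
    return (3 * rho + rho**3) / 4

ts = [0.1 * k for k in range(1, 30)]
logs = [math.log(stab_maj3(t)) for t in ts]
second_diffs = [logs[i-1] - 2*logs[i] + logs[i+1] for i in range(1, len(logs)-1)]
print(all(d >= -1e-12 for d in second_diffs))   # True: log Stab(t) is convex
```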
Optimal mean-based algorithms for trace reconstruction
- Anindya De, R. O'Donnell, R. Servedio
- Computer Science, Mathematics · Symposium on the Theory of Computing
- 9 December 2016
For any constant deletion rate 0 < δ < 1, a mean-based algorithm is given that uses exp(O(n^{1/3})) time and traces; it is proved that any mean-based algorithm must use at least exp(Ω(n^{1/3})) traces; and a surprising result is found: for deletion probabilities δ > 1/2, the presence of insertions can actually help with trace reconstruction.
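A small simulation sketch of the setting (assumptions mine: bits are 0/1 and each trace is an independent pass of x through a deletion channel); a mean-based algorithm only looks at the per-position empirical means of the traces:

```python
# Toy deletion-channel simulator and the per-position mean statistics that
# "mean-based" trace reconstruction algorithms work from.
import random

def deletion_trace(x, delta):
    """One trace: delete each bit of x independently with probability delta."""
    return [b for b in x if random.random() > delta]

def mean_statistics(x, delta, num_traces):
    """Per-position empirical means of the traces, zero-padded to len(x)."""
    sums = [0.0] * len(x)
    for _ in range(num_traces):
        for j, b in enumerate(deletion_trace(x, delta)):
            sums[j] += b
    return [s / num_traces for s in sums]

x = [1, 0, 1, 1, 0, 0, 1, 0]
print(mean_statistics(x, delta=0.3, num_traces=2000))
```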
Learning intersections and thresholds of halfspaces
- Adam R. Klivans, R. O'Donnell, R. Servedio
- Mathematics, Computer Science · The 43rd Annual IEEE Symposium on Foundations of…
- 16 November 2002
We give the first polynomial time algorithm to learn any function of a constant number of halfspaces under the uniform distribution to within any constant error parameter. We also give the first…
Testing Halfspaces
- Kevin Matulef, R. O'Donnell, R. Rubinfeld, R. Servedio
- Mathematics · SIAM Journal on Computing (Print)
- 4 January 2009
This paper addresses the problem of testing whether a Boolean-valued function f is a halfspace, i.e. a function of the form f(x) = sgn(w · x − θ), by giving an algorithm that distinguishes halfspaces from functions that are ε-far from any halfspace using only poly(1/ε) queries, independent of the dimension n.
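To fix notation, here is a small sketch (function and helper names are mine) of a halfspace f(x) = sgn(w · x − θ) over {-1,1}^n together with a Monte Carlo estimate of the distance between two Boolean functions under the uniform distribution, which is the sense in which "ε-far" is meant:

```python
# A halfspace and a sampling-based estimate of Pr_x[f(x) != g(x)].
import random

def halfspace(w, theta):
    return lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) - theta >= 0 else -1

def estimated_distance(f, g, n, samples=10000):
    """Fraction of uniformly random inputs in {-1,1}^n on which f and g disagree."""
    disagreements = 0
    for _ in range(samples):
        x = [random.choice([-1, 1]) for _ in range(n)]
        if f(x) != g(x):
            disagreements += 1
    return disagreements / samples

maj = halfspace([1, 1, 1], 0)          # Majority on 3 bits is a halfspace
parity = lambda x: x[0] * x[1] * x[2]  # Parity is not computable by any halfspace
print(estimated_distance(maj, parity, 3))  # estimated Pr[maj != parity], ~0.75
```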
Learning Geometric Concepts via Gaussian Surface Area
- Adam R. Klivans, R. O'Donnell, R. Servedio
- Computer Science · 49th Annual IEEE Symposium on Foundations of…
- 25 October 2008
Gaussian surface area essentially characterizes the computational complexity of learning under the Gaussian distribution, and this is the first subexponential time algorithm for learning general convex sets even in the noise-free (PAC) model.
...