Complexity theoretic limitations on learning DNF's

@inproceedings{Daniely2016ComplexityTL,
  title={Complexity theoretic limitations on learning DNF's},
  author={Amit Daniely and Shai Shalev-Shwartz},
  booktitle={COLT},
  year={2016}
}
Using the recently developed framework of Daniely et al. (2014), we show that under a natural assumption on the complexity of random K-SAT, learning DNF formulas is hard. Furthermore, the same assumption implies the hardness of various other learning problems, including learning intersections of ω(log(n)) halfspaces, agnostically learning conjunctions, and virtually all (distribution-free) learning problems that were previously shown hard (under various complexity assumptions).
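
To make the object of study concrete, the sketch below is an illustrative example only (not taken from the paper): it evaluates a small DNF formula, i.e. a disjunction (OR) of terms, each term a conjunction (AND) of literals over Boolean variables. The learning problem whose hardness the paper addresses asks, informally, for an efficient algorithm that recovers a low-error hypothesis from examples labeled by such a formula. The encoding of terms as lists of signed integers and the helper name evaluate_dnf are assumptions made here for illustration.

# Illustrative sketch (not from the paper): a DNF formula over n Boolean
# variables, encoded as a list of terms; each term is a list of signed
# integers, where +i means "x_i is true" and -i means "x_i is false".
from typing import List

def evaluate_dnf(terms: List[List[int]], x: List[bool]) -> bool:
    """Return True iff some term (a conjunction of literals) is satisfied by x."""
    def literal(lit: int) -> bool:
        value = x[abs(lit) - 1]          # variables are 1-indexed in this encoding
        return value if lit > 0 else not value
    return any(all(literal(lit) for lit in term) for term in terms)

if __name__ == "__main__":
    # f(x) = (x1 AND NOT x3) OR (x2 AND x3)
    formula = [[1, -3], [2, 3]]
    print(evaluate_dnf(formula, [True, False, False]))   # True: first term is satisfied
    print(evaluate_dnf(formula, [False, False, True]))   # False: no term is satisfied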
