Corpus ID: 236965641

On Learning and Testing Decision Tree

@article{Bshouty2021OnLA,
  title={On Learning and Testing Decision Tree},
  author={N. Bshouty and Catherine A. Haddad-Zaknoon},
  journal={ArXiv},
  year={2021},
  volume={abs/2108.04587}
}
In this paper, we study learning and testing decision trees of size and depth that are significantly smaller than the number of attributes n. Our main result addresses the problem of poly(n, 1/ε)-time algorithms with poly(s, 1/ε) query complexity (independent of n) that distinguish functions that are decision trees of size s from functions that are ε-far from every decision tree of size φ(s, 1/ε), for some function φ > s. The best known result is the recent one that follows from Blanc…
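To make the "ε-far" notion in the abstract concrete: a function g is ε-far from f (under the uniform distribution) if f and g disagree on at least an ε fraction of inputs. Below is a minimal illustrative sketch of that distance measure; the helper `relative_distance` and the example functions are hypothetical, not from the paper, and brute-force enumeration is only feasible for tiny n.

```python
from itertools import product

def relative_distance(f, g, n):
    """Fraction of inputs in {0,1}^n on which f and g disagree --
    the distance underlying the 'epsilon-far' definition."""
    disagreements = sum(f(x) != g(x) for x in product([0, 1], repeat=n))
    return disagreements / 2 ** n

# Example (hypothetical): f is a depth-1 decision tree querying x[0];
# g is the parity of three bits.
f = lambda x: x[0]
g = lambda x: x[0] ^ x[1] ^ x[2]

d = relative_distance(f, g, 3)  # f and g disagree exactly when x[1]^x[2] = 1
```

Here d = 0.5, so g is (1/2)-far from the depth-1 tree f; a tester with poly(s, 1/ε) query complexity must accept functions like f while rejecting functions that are ε-far from all small trees.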


References

Showing 1-10 of 33 references
Learning decision trees from random examples
We define the rank of a decision tree and show that, for any fixed r, the class of all decision trees of rank at most r on n Boolean variables is learnable from random examples in time…
Efficient Sample Extractors for Juntas with Applications
Using the new sample extractor, testing by implicit learning can yield testers with better query complexity than testers tailored to a specific problem, such as the tester of Parnas et al.
Decision Tree Approximations of Boolean Functions
It is shown that, given an alternative representation of a Boolean function f, one can find a decision tree T that approximates f to any desired accuracy; this result implies proper PAC-learnability of decision trees under the uniform distribution without membership queries.
Testing Problems with Sublearning Sample Complexity
This work studies the problem of determining, for a class of functions H, whether an unknown target function f is contained in H or is "far" from every function in H, and demonstrates that the number of examples required for testing grows only as O(s^{1/2+δ}), where δ is any small constant.
Exact Learning when Irrelevant Variables Abound
The proofs of the negative results show that general decision trees and related representations are not learnable in polynomial time using equivalence queries alone, confirming a folklore theorem.
Adaptive Exact Learning of Decision Trees from Membership Queries
This paper studies the adaptive learnability of decision trees of depth at most $d$ from membership queries, solves the problem with a randomized polynomial-time algorithm, and improves the query complexity of both algorithms.
Simple Learning Algorithms for Decision Trees and Multivariate Polynomials
This new approach yields simple learning algorithms for multivariate polynomials and decision trees over finite fields under any constant-bounded product distribution, gives a learning algorithm for O(log n)-depth decision trees from membership queries only, and gives a new learning algorithm for any multivariate polynomial over sufficiently large fields from membership queries only.
Testing for Concise Representations
The approach combines ideas from the junta test of Fischer et al. [16] with ideas from learning theory, and yields property testers that make poly(s/ε) queries for Boolean function classes such as s-term DNF formulas and s-sparse polynomials over finite fields.
Almost optimal distribution-free junta testing
  • N. Bshouty
  • Computer Science, Mathematics
  • Computational Complexity Conference
  • 2019
This paper gives a simple two-sided-error adaptive algorithm that makes Õ(k/ε) queries and shows how it can be used to test whether an unknown n-variable Boolean function is a k-junta in the distribution-free property-testing model.
Testing and reconstruction via decision trees
The results yield reconstruction algorithms for numerous other Boolean function properties (Fourier degree, randomized and quantum query complexities, certificate complexity, sensitivity, etc.), which in turn yield new testers for these properties.