Corpus ID: 244714780

On computable learning of continuous features

@inproceedings{Ackerman2021OnCL,
  title={On computable learning of continuous features},
  author={Nathanael Leedom Ackerman and Julian Asilis and Jieqi Di and Cameron E. Freer and Jean-Baptiste Tristan},
  year={2021}
}
We introduce definitions of computable PAC learning for binary classification over computable metric spaces. We provide sufficient conditions for learners that are empirical risk minimizers (ERM) to be computable, and bound the strong Weihrauch degree of an ERM learner under more general conditions. We also give a presentation of a hypothesis class that does not admit any proper computable PAC learner with computable sample function, despite the underlying class being PAC learnable. 
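The abstract's central object is an empirical risk minimizer (ERM) for binary classification. The following is a minimal sketch of what ERM means in this setting, assuming a finite hypothesis class represented as Python callables over real-valued features; the names (`erm_learner`, `thresholds_on_line`) are illustrative only and are not taken from the paper, which works over general computable metric spaces and analyzes when such learners can be made computable.

```python
# Sketch of an empirical risk minimizer (ERM) for binary classification.
# Hypotheses are callables mapping a feature to {0, 1}; the class used in
# the example (threshold classifiers on the line) is a standard PAC-learnable
# class and is chosen purely for illustration.

from typing import Callable, List, Sequence, Tuple

Hypothesis = Callable[[float], int]
Sample = Sequence[Tuple[float, int]]  # pairs (feature, label)


def empirical_risk(h: Hypothesis, sample: Sample) -> float:
    """Fraction of sample points that h misclassifies."""
    return sum(1 for x, y in sample if h(x) != y) / len(sample)


def erm_learner(hypotheses: List[Hypothesis], sample: Sample) -> Hypothesis:
    """Return a hypothesis of minimal empirical risk on the sample."""
    return min(hypotheses, key=lambda h: empirical_risk(h, sample))


def thresholds_on_line(grid: Sequence[float]) -> List[Hypothesis]:
    """Threshold classifiers x -> [x >= t] for each t in the grid."""
    return [lambda x, t=t: int(x >= t) for t in grid]


if __name__ == "__main__":
    sample = [(0.1, 0), (0.4, 0), (0.6, 1), (0.9, 1)]
    hs = thresholds_on_line([i / 10 for i in range(11)])
    best = erm_learner(hs, sample)
    print("empirical risk of ERM output:", empirical_risk(best, sample))
```

The computability questions studied in the paper arise precisely because, over continuous feature spaces, the minimization step above cannot in general be carried out by exhaustive search over a finite class as it is here.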
On characterizations of learnability with computable learners
TLDR
This work gives a characterization of the closely related notion of strong CPAC learning, a negative answer to the open problem posed by Agarwal et al. (2021) of whether all decidable PAC learnable classes are improperly CPAC learnable, and a study of the arithmetical complexity of learnability.

References

Showing 1-10 of 20 references
On Learnability with Computable Learners
TLDR
The notion of CPAC learnability is proposed by adding basic computability requirements to the PAC learning framework, and it is shown that in this framework learnability of a binary hypothesis class is no longer implied by finiteness of its VC dimension.
PAC learning, VC dimension, and the arithmetic hierarchy
TLDR
This family of concept classes is sufficient to cover all standard examples, and has the property that PAC learnability is equivalent to finite VC dimension.
Statistical Learning of Arbitrary Computable Classifiers
TLDR
This work shows that learning over the set of all computable labeling functions is indeed possible, develops a learning algorithm, and shows that the sample complexity cannot be bounded independently of the distribution.
Learning Theory in the Arithmetic Hierarchy
TLDR
In proving the $\Sigma^0_5$-completeness result for behaviorally correct learning, this work proves a result of independent interest: if a uniformly computably enumerable family is not learnable, then for any computable learner there is an enumeration witnessing failure.
A Computability Perspective on (Verified) Machine Learning
TLDR
The computational tasks underlying verified ML are defined in a model-agnostic way, and it is shown that they are in principle computable.
A theory of the learnable
TLDR
This paper regards learning as the phenomenon of knowledge acquisition in the absence of explicit programming, and gives a precise methodology for studying this phenomenon from a computational viewpoint.
Learnability and the Vapnik-Chervonenkis dimension
TLDR
This paper shows that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned.
Sample Compression, Learnability, and the Vapnik-Chervonenkis Dimension
TLDR
It is demonstrated that the existence of a fixed-size sample compression scheme for a class C is sufficient to ensure that C is PAC learnable, and the relationship between sample compression schemes and the VC dimension is explored.
Computability on subsets of metric spaces