# On computable learning of continuous features

```bibtex
@inproceedings{Ackerman2021OnCL,
  title  = {On computable learning of continuous features},
  author = {Nathanael Leedom Ackerman and Julian Asilis and Jieqi Di and Cameron E. Freer and Jean-Baptiste Tristan},
  year   = {2021}
}
```

We introduce definitions of computable PAC learning for binary classification over computable metric spaces. We provide sufficient conditions for learners that are empirical risk minimizers (ERM) to be computable, and bound the strong Weihrauch degree of an ERM learner under more general conditions. We also give a presentation of a hypothesis class that does not admit any proper computable PAC learner with computable sample function, despite the underlying class being PAC learnable.
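The empirical risk minimizers at the heart of the abstract can be illustrated with a minimal sketch: a generic ERM learner over a finite hypothesis class, not the paper's construction. The threshold class and sample below are invented for the example.

```python
from typing import Callable, Sequence, Tuple

def erm_learner(
    hypotheses: Sequence[Callable[[float], int]],
    sample: Sequence[Tuple[float, int]],
) -> Callable[[float], int]:
    """Return a hypothesis minimizing empirical risk on the sample."""
    def empirical_risk(h: Callable[[float], int]) -> float:
        # Fraction of labeled examples the hypothesis misclassifies.
        return sum(1 for x, y in sample if h(x) != y) / len(sample)
    return min(hypotheses, key=empirical_risk)

# Hypothetical class: threshold classifiers h_t(x) = [x >= t] on [0, 1].
thresholds = [lambda x, t=t / 10: int(x >= t) for t in range(11)]
sample = [(0.1, 0), (0.2, 0), (0.6, 1), (0.9, 1)]
h = erm_learner(thresholds, sample)
```

The computability questions the paper studies begin exactly here: `min` over a finite list is trivially computable, but over a continuous feature space the search and even the representation of hypotheses require care.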

## One Citation

On characterizations of learnability with computable learners

- Computer Science, ArXiv
- 2022

This work gives a characterization of the closely related notion of strong CPAC learning, a negative answer to the open problem posed by Agarwal et al. (2021) of whether all decidable PAC learnable classes are improperly CPAC learnable, and a study of the arithmetical complexity of learnability.

## References

Showing 1-10 of 20 references

On Learnability with Computable Learners

- Computer Science
- 2020

The notion of CPAC learnability is proposed by adding basic computability requirements to the PAC learning framework, and it is shown that in this framework learnability of a binary hypothesis class is no longer implied by finiteness of its VC dimension.

PAC learning, VC dimension, and the arithmetic hierarchy

- Computer Science, Arch. Math. Log.
- 2015

This family of concept classes is sufficient to cover all standard examples, and has the property that PAC learnability is equivalent to finite VC dimension.

Statistical Learning of Arbitrary Computable Classifiers

- Computer Science, ArXiv
- 2008

This work shows that learning over the set of all computable labeling functions is indeed possible, and develops a learning algorithm, and shows that bounding sample complexity independently of the distribution is impossible.

Learning Theory in the Arithmetic Hierarchy

- Computer Science, Mathematics, The Journal of Symbolic Logic
- 2014

In proving the $\Sigma^0_5$-completeness result for behaviorally correct learning, this work proves a result of independent interest: if a uniformly computably enumerable family is not learnable, then for any computable learner there is an enumeration witnessing failure.

A Computability Perspective on (Verified) Machine Learning

- Computer Science, ArXiv
- 2021

The computational tasks underlying verified ML are defined in a model-agnostic way, and it is shown that they are in principle computable.

Computability of probability measures and Martin-Löf randomness over metric spaces

- Mathematics, Computer Science, Inf. Comput.
- 2009

A theory of the learnable

- Computer Science, STOC '84
- 1984

This paper regards learning as the phenomenon of knowledge acquisition in the absence of explicit programming, and gives a precise methodology for studying this phenomenon from a computational viewpoint.

Learnability and the Vapnik-Chervonenkis dimension

- Computer Science, JACM
- 1989

This paper shows that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned.
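The Vapnik-Chervonenkis dimension invoked here is a purely combinatorial quantity, and for small finite classes it can be computed by brute force. A rough sketch (the threshold class and domain are made-up examples, not from the cited paper):

```python
from itertools import combinations

def shatters(hypotheses, points):
    """True iff the class realizes every binary labeling of `points`."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

def vc_dimension(hypotheses, domain):
    """Largest d such that some d-subset of `domain` is shattered (brute force)."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(hypotheses, s) for s in combinations(domain, k)):
            d = k
    return d

# Hypothetical example: threshold classifiers shatter singletons but never
# a pair (the labeling (1, 0) with x1 < x2 is unrealizable), so VC dim = 1.
thresholds = [lambda x, t=t: int(x >= t) for t in range(5)]
print(vc_dimension(thresholds, [0, 1, 2, 3]))  # 1
```

The cited characterization says finiteness of exactly this quantity is equivalent to distribution-free PAC learnability; the computable setting studied in the main paper asks when such a learner can moreover be effectively realized.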

Sample Compression, Learnability, and the Vapnik-Chervonenkis Dimension

- Computer Science, Machine Learning
- 2004

It is demonstrated that the existence of a fixed-size sample compression scheme for a class C is sufficient to ensure that C is PAC learnable, and the relationship between sample compression schemes and the VC dimension is explored.
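As a toy illustration of the compression idea (an invented example for exposition, not the cited paper's scheme): one-dimensional threshold classifiers admit a size-one compression, since keeping only the smallest positively labeled example suffices to reconstruct a consistent hypothesis.

```python
def compress(sample):
    """Compress a threshold-realizable sample h_t(x) = [x >= t] down to
    the single smallest positive example (None if all labels are 0)."""
    positives = [x for x, y in sample if y == 1]
    return min(positives) if positives else None

def reconstruct(kept):
    """Rebuild a classifier consistent with the original sample."""
    if kept is None:
        return lambda x: 0          # no positives seen: predict 0 everywhere
    return lambda x: int(x >= kept)  # threshold at the kept example

sample = [(0.1, 0), (0.4, 1), (0.7, 1)]
h = reconstruct(compress(sample))
# h agrees with every labeled example in the sample
```

Consistency holds because, for a sample realized by threshold t, every negative lies below t and hence below the kept minimum positive, while every positive lies at or above it.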

Computability on subsets of metric spaces

- Mathematics, Computer Science, Theor. Comput. Sci.
- 2003