Corpus ID: 219980646

A Constructive, Type-Theoretic Approach to Regression via Global Optimisation

@article{Ghica2020ACT,
  title={A Constructive, Type-Theoretic Approach to Regression via Global Optimisation},
  author={Dan R. Ghica and Todd Waugh Ambridge},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.12868}
}
We examine the connections between deterministic, complete, and general global optimisation of continuous functions and a general concept of regression from the perspective of constructive type theory via the concept of 'searchability'. We see how the property of convergence of global optimisation is a straightforward consequence of searchability. The abstract setting allows us to generalise searchability and continuity to higher-order functions, so that we can formulate novel convergence… 
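
The 'searchability' mentioned in the abstract is the Escardó-style notion of a selection functional: a type X is searchable when there is a functional taking any decidable predicate on X to a candidate that satisfies it whenever some element does, which makes existential quantification decidable and, in particular, lets one extract a global optimiser. The paper develops this in constructive type theory for continuous and higher-order types; the Haskell sketch below only illustrates the finite-domain core of the idea, and every name in it (Searcher, searchList, argminOver) is illustrative rather than taken from the paper's formalisation.

-- A finite-domain Haskell sketch of 'searchability' (Escardó-style selection
-- functions), the notion the abstract builds on.  The paper works in
-- constructive type theory over continuous and higher-order types; this
-- illustration only uses a finite grid, and all names below are hypothetical.

-- A searcher for a type x: given a decidable predicate, it returns an element
-- that satisfies the predicate whenever any element does.
type Searcher x = (x -> Bool) -> x

-- Every inhabited finite list yields a searcher.
searchList :: [x] -> Searcher x
searchList []     _ = error "a searchable type must be inhabited"
searchList (x:xs) p = case filter p (x:xs) of
                        (y:_) -> y   -- a genuine witness
                        []    -> x   -- no witness exists; any element will do

-- Decidable existential quantification, for free, from a searcher.
exists :: Searcher x -> (x -> Bool) -> Bool
exists s p = p (s p)

-- A global minimiser over the grid, obtained by searching for a point that
-- no other grid point strictly improves on.
argminOver :: Ord r => [x] -> (x -> r) -> x
argminOver grid f = searchList grid (\x -> all (\y -> f x <= f y) grid)

main :: IO ()
main = do
  let grid = [-1.0, -0.99 .. 1.0] :: [Double]
      f x  = (x - 0.3) ^ (2 :: Int)   -- minimised at 0.3
  print (argminOver grid f)           -- prints a grid point near 0.3

Convergence is immediate here because the domain is a finite grid; the abstract's point is that the same consequence of searchability survives the generalisation to continuous and higher-order settings.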


References

SHOWING 1-10 OF 33 REFERENCES
Complete search in continuous global optimization and constraint satisfaction
This survey covers the state of the art of techniques for solving general-purpose constrained global optimization problems and continuous constraint satisfaction problems, with emphasis on complete methods…
Algorithmic solution of higher type equations
If x is unique, and X and Y are subspaces of Kleene–Kreisel spaces of continuous functionals with X exhaustible, then x is computable uniformly in f, y and the exhaustibility condition; it is also shown to be semi-decidable whether a function defined on such a compact set fails to be analytic.
Synthetic Topology of Data Types and Classical Spaces
  • M. Escardó, Electron. Notes Theor. Comput. Sci., 2004
An overview of gradient descent optimization algorithms
This article looks at different variants of gradient descent, summarizes challenges, introduces the most common optimization algorithms, reviews architectures in a parallel and distributed setting, and investigates additional strategies for optimizing gradient descent.
Minimization by Random Search Techniques
Two general convergence proofs for random search algorithms are given, and it is shown how they extend the proofs available for specific variants of the conceptual algorithm studied here (a minimal sketch of such a random-search minimiser follows this reference list).
Constructive Approximation
This paper works on [-1, 1] and obtains Markov-type estimates for the derivatives of polynomials from a rather wide family of classes of constrained polynomials; the resulting estimates turn out to be sharp.
A theory of the learnable
This paper regards learning as the phenomenon of knowledge acquisition in the absence of explicit programming, and gives a precise methodology for studying this phenomenon from a computational viewpoint.
Semidefinite Relaxations of Fractional Programs via Novel Convexification Techniques
This work develops the convex and concave envelopes of z = x/y over a hypercube and proposes a new relaxation technique for fractional programs that includes the derived envelopes.
Branch-and-Bound Methods: A Survey
The essential features of the branch-and-bound approach to constrained optimization are described, and several specific applications are reviewed, including integer linear programming (Land-Doig and Balas methods), nonlinear programming (minimization of nonconvex objective functions), and the quadratic assignment problem (Gilmore and Lawler methods).
A review of recent advances in global optimization
This paper presents an overview of the research progress in deterministic global optimization during the last decade (1998–2008). It covers the areas of twice continuously differentiable nonlinear optimization…
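
Several of the entries above describe sampling-based optimisers; the "Minimization by Random Search Techniques" entry in particular concerns the conceptual algorithm "sample a point, keep the best seen so far, repeat". The dependency-free Haskell sketch below (referenced from that entry) illustrates this incumbent-based scheme; the hand-rolled linear congruential generator and all names (lcg, randomSearch) are illustrative assumptions, not part of the cited paper.

-- A minimal, dependency-free sketch of the conceptual random-search minimiser
-- whose convergence is analysed in "Minimization by Random Search Techniques"
-- (see the reference above).  The hand-rolled LCG and every name here are
-- illustrative assumptions, not part of the cited paper.

-- One step of a linear congruential generator, yielding a Double in [0, 1).
lcg :: Integer -> (Double, Integer)
lcg seed = (fromIntegral s' / fromIntegral m, s')
  where
    m  = 2 ^ (31 :: Int) :: Integer
    s' = (1103515245 * seed + 12345) `mod` m

-- Pure random search on an interval: sample n points uniformly in [lo, hi]
-- and keep the best point seen so far (the "incumbent").
randomSearch :: (Double -> Double) -> (Double, Double) -> Int -> Integer -> Double
randomSearch f (lo, hi) n seed0 = go n seed0 lo
  where
    go 0 _    best = best
    go k seed best =
      let (u, seed') = lcg seed
          x          = lo + u * (hi - lo)
          best'      = if f x < f best then x else best
      in  go (k - 1) seed' best'

main :: IO ()
main =
  -- With enough samples the incumbent approaches the global minimiser (x = 0.3).
  print (randomSearch (\x -> (x - 0.3) ^ (2 :: Int)) (-1, 1) 10000 42)

The convergence proofs summarised in that entry are statements about exactly this incumbent sequence, under conditions on the sampling distribution and the objective.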