# Learning Theory: An Approximation Theory Viewpoint

@book{Cucker2007LearningTA, title={Learning Theory: An Approximation Theory Viewpoint}, author={Felipe Cucker and Ding-Xuan Zhou}, publisher={Cambridge University Press}, year={2007} }

Foreword
Preface
1. The framework of learning
2. Basic hypothesis spaces
3. Estimating the sample error
4. Polynomial decay of the approximation error
5. Estimating covering numbers
6. Logarithmic decay of the approximation error
7. On the bias-variance problem
8. Regularization
9. Support vector machines for classification
10. General regularized classifiers
Bibliography
Index
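The book's central object (Chapters 3–8) is regularized least-squares regression in a reproducing kernel Hilbert space: minimize the empirical squared error plus a penalty λ‖f‖²_K. A minimal sketch of that scheme, using a Gaussian kernel and toy data (the kernel width, λ, and sample below are illustrative assumptions, not values from the book):

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian kernel matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def regularized_least_squares(X, y, lam=0.1, sigma=1.0):
    # By the representer theorem, the minimizer of
    #   (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2
    # is f(x) = sum_i c_i K(x, x_i) with (K + lam * m * I) c = y.
    m = len(y)
    K = gaussian_kernel(X, X, sigma)
    c = np.linalg.solve(K + lam * m * np.eye(m), y)
    return lambda Z: gaussian_kernel(Z, X, sigma) @ c

# Toy regression problem: noisy samples of sin(3x) on [-1, 1].
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (50, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(50)

f = regularized_least_squares(X, y, lam=1e-3, sigma=0.5)
print(np.mean((f(X) - y) ** 2))  # empirical (training) squared error
```

Shrinking λ reduces the approximation error while inflating the sample error, which is exactly the bias-variance trade-off the book's error decomposition quantifies.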

#### 162 Citations

- **Learning Rates of Least-Square Regularized Regression.** *Found. Comput. Math.*, 2006.

- **Coefficient Regularized Algorithms for Learning and Classification.** *2012 Second International Conference on Intelligent System Design and Engineering Application*, 2012.

- **Classification with Gaussians and convex loss II: improving error bounds by noise conditions.** 2011.

- **Iterative Regularization for Learning with Convex Loss Functions.** *J. Mach. Learn. Res.*, 2016.

- **Analysis of Regression Algorithms with Unbounded Sampling.** *Neural Computation*, 2020.