Publications
Learning Theory Estimates via Integral Operators and Their Approximations
The regression problem in learning theory is investigated with least-squares Tikhonov regularization schemes in reproducing kernel Hilbert spaces (RKHS). We follow our previous work and apply the …
  • 453 citations (47 highly influential) · PDF
Learning Theory: An Approximation Theory Viewpoint: Index
  • 262 citations (35 highly influential) · PDF
Shannon sampling II: Connections to learning theory
We continue our study [S. Smale, D.X. Zhou, Shannon sampling and function reconstruction from point values, Bull. Amer. Math. Soc. 41 (2004) 279–305] of Shannon sampling and function …
  • 179 citations (20 highly influential) · PDF
Distributed Learning with Regularized Least Squares
We study distributed learning with the least squares regularization scheme in a reproducing kernel Hilbert space (RKHS).
  • 87 citations (20 highly influential) · PDF
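The distributed scheme the summary above refers to can be sketched as divide-and-conquer regression: partition the sample, solve the regularized least-squares problem on each subset, and average the local estimators. This is a hedged illustration under assumed kernel and parameter choices, not the paper's exact algorithm or rates:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=0.5):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def local_krr(X, y, lam):
    # Regularized least squares on one local data subset:
    # alpha = (K + lam*m*I)^{-1} y
    m = len(y)
    K = gaussian_kernel(X, X)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def distributed_predict(subsets, lam, X_new):
    # Divide-and-conquer: fit a local estimator per subset,
    # then average the local predictions with equal weights.
    preds = []
    for X, y in subsets:
        alpha = local_krr(X, y, lam)
        preds.append(gaussian_kernel(X_new, X) @ alpha)
    return np.mean(preds, axis=0)

# Toy usage: 120 noisy samples of sin, split into 4 disjoint subsets
rng = np.random.default_rng(1)
X = rng.uniform(0, 2 * np.pi, size=(120, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(120)
subsets = [(X[i::4], y[i::4]) for i in range(4)]
grid = np.linspace(0, 2 * np.pi, 50)[:, None]
f_bar = distributed_predict(subsets, lam=1e-3, X_new=grid)
```

Averaging reduces the variance of the local estimators while each local solve only factors an m/4-by-m/4 kernel matrix, which is the computational point of the divide-and-conquer approach.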
Multi-kernel regularized classifiers
A family of classification algorithms generated from Tikhonov regularization schemes is considered.
  • 147 citations (17 highly influential) · PDF
Learning Rates of Least-Square Regularized Regression
This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces.
  • 209 citations (16 highly influential) · PDF
Learning Theory: An Approximation Theory Viewpoint
Preface · Foreword · 1. The framework of learning · 2. Basic hypothesis spaces · 3. Estimating the sample error · 4. Polynomial decay approximation error · 5. Estimating covering numbers · 6. Logarithmic decay …
  • 162 citations (14 highly influential)
Learning Coordinate Covariances via Gradients
We introduce an algorithm that learns gradients from samples in the supervised learning framework.
  • 79 citations (14 highly influential) · PDF
The covering number in learning theory
  • Ding-Xuan Zhou
  • Computer Science, Mathematics
  • J. Complex.
  • 1 September 2002
We show that the eigenfunctions of the Hilbert–Schmidt operator L_K associated with a Mercer kernel K may not be uniformly bounded.
  • 260 citations (13 highly influential) · PDF