
The problem of learning is arguably at the very core of the problem of intelligence, both biological and artificial.

The best understanding of what one can see comes from theories of what one can't see. This thought has been expressed in a number of ways by different scientists, with examples ranging from gravity to economic equilibrium. For learning theory we see its expression in the focus on the regression function defined by an unknown… (More)

Preamble I first met René at the well-known 1956 meeting on topology in Mexico City. He then came to the University of Chicago, where I was starting my job as instructor for the fall of 1956. He, Suzanne, Clara and I became good friends and saw much of each other for many decades, especially at IHES in Paris. Thom's encouragement and support were important… (More)

We continue our study [12] of Shannon sampling and function reconstruction. In this paper, the error analysis is improved. The problem of function reconstruction is extended to a more general setting with frames beyond point evaluation. Then we show how our approach can be applied to learning theory: a functional analysis framework is presented; sharp,… (More)
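The point-evaluation setting the paper starts from can be illustrated with the classical truncated sinc series; the following is a hedged numpy sketch of that baseline only (the paper itself generalizes sampling to frames and gives a sharper error analysis), with all names illustrative:

```python
import numpy as np

def sinc_reconstruct(samples, sample_points, t):
    """Evaluate the truncated Shannon series sum_k f(k) sinc(t - k)."""
    # np.sinc is the normalized sinc: sin(pi x) / (pi x).
    return sum(f_k * np.sinc(t - k) for k, f_k in zip(sample_points, samples))

# A band-limited test signal, sampled at the integers (within the
# Nyquist rate for unit sample spacing).
ks = np.arange(-50, 51)
f = lambda t: np.sinc(t / 2.0)
samples = f(ks)

t0 = 0.3
approx = sinc_reconstruct(samples, ks, t0)
error = abs(approx - f(t0))  # small, limited by series truncation
```

The residual error here comes purely from truncating the series to finitely many samples, which is the kind of error the paper's analysis quantifies.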

- Sayan Mukherjee, Ding-Xuan Zhou
- Journal of Machine Learning Research
- 2006

We introduce an algorithm that learns gradients from samples in the supervised learning framework. An error analysis is given for the convergence of the gradient estimated by the algorithm to the true gradient. The utility of the algorithm for the problem of variable selection as well as determining variable covariance is illustrated on simulated data as… (More)
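The paper's algorithm learns gradients in an RKHS via Tikhonov regularization; as a much-simplified stand-in, the sketch below estimates the gradient with a regularized linear fit and uses the coefficient magnitudes for variable selection. All names, the threshold, and the linear surrogate are illustrative assumptions, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
# Only coordinates 0 and 2 influence the response.
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 0.1 * rng.normal(size=n)

lam = 1e-3  # regularization parameter (illustrative choice)
# Ridge-regularized least squares: the coefficients estimate the gradient.
grad_est = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Variable selection: keep coordinates whose estimated partial
# derivative is large relative to the largest one.
selected = np.flatnonzero(np.abs(grad_est) > 0.5 * np.abs(grad_est).max())
```

On this simulated data the selected set recovers exactly the two relevant coordinates, mirroring the variable-selection use the abstract describes.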

- Qiang Wu, Yiming Ying, Ding-Xuan Zhou
- Foundations of Computational Mathematics
- 2006

This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and the… (More)
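The regularized least-squares algorithm in an RKHS has a closed form: with Mercer kernel K and sample (x_i, y_i), the estimator is f_λ(x) = Σ_i c_i K(x, x_i) with c = (K + λnI)⁻¹y. A minimal numpy sketch, assuming a Gaussian kernel and an illustrative choice of λ and σ:

```python
import numpy as np

def gaussian_kernel(A, B, sigma=0.5):
    # Pairwise squared distances, then the Gaussian (RBF) kernel.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

rng = np.random.default_rng(1)
n = 100
X = rng.uniform(-1, 1, size=(n, 1))
y = np.sin(np.pi * X[:, 0]) + 0.05 * rng.normal(size=n)

lam = 1e-3
K = gaussian_kernel(X, X)
c = np.linalg.solve(K + lam * n * np.eye(n), y)  # c = (K + lam*n*I)^{-1} y

# Evaluate the learned function at a test point.
x_test = np.array([[0.25]])
f_hat = (gaussian_kernel(x_test, X) @ c)[0]
err = abs(f_hat - np.sin(np.pi * 0.25))
```

The learning rates the paper establishes describe how `err` shrinks with the sample size n, as a function of λ, the approximation property of the kernel, and the noise.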

- Ding-Xuan Zhou, Kurt Jetter
- Adv. Comput. Math.
- 2006

- Qiang Wu, Ding-Xuan Zhou
- Computers & Mathematics with Applications
- 2008

- Ding-Xuan Zhou
- J. Complexity
- 2002

The covering number of a ball of a reproducing kernel Hilbert space as a subset of the continuous function space plays an important role in Learning Theory. We give estimates for this covering number by means of the regularity of the Mercer kernel K. For convolution-type kernels K(x, t) = k(x − t) on [0, 1]^n, we provide estimates depending on the decay of… (More)
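The connection between kernel regularity and covering numbers can be seen numerically: the smoother the kernel, the faster the spectrum of its Gram matrix decays, which is what keeps the RKHS ball small in C(X). A hedged illustration with a smooth convolution-type kernel on [0, 1] (the grid size, kernel width, and index 50 are all arbitrary illustrative choices):

```python
import numpy as np

x = np.linspace(0, 1, 200)
k = lambda u: np.exp(-u**2 / 0.02)           # smooth convolution profile k
K = k(x[:, None] - x[None, :]) / len(x)      # quadrature-scaled Gram matrix

# Eigenvalues in decreasing order; smooth kernels show rapid decay.
eigvals = np.sort(np.linalg.eigvalsh(K))[::-1]
tail_ratio = eigvals[50] / eigvals[0]        # many orders of magnitude below 1
```

A slowly decaying profile k (say, with a kink at 0) would flatten this spectrum, matching the paper's theme that covering-number estimates hinge on the decay/regularity of k.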

- Di-Rong Chen, Qiang Wu, Yiming Ying, Ding-Xuan Zhou
- Journal of Machine Learning Research
- 2004

The purpose of this paper is to provide a PAC error analysis for the q-norm soft margin classifier, a support vector machine classification algorithm. It consists of two parts: regularization error and sample error. While many techniques are available for treating the sample error, much less is known for the regularization error and the corresponding… (More)
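For q = 1 the q-norm soft margin classifier reduces to the standard hinge-loss SVM. A minimal subgradient-descent sketch of the regularized objective λ/2‖w‖² + meanᵢ max(0, 1 − yᵢ w·xᵢ), assuming linearly structured synthetic data and illustrative step size and iteration count:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
y = rng.choice([-1.0, 1.0], size=n)
X = rng.normal(size=(n, 2))
X[:, 0] += 2.0 * y  # class-dependent shift: the first coordinate separates

w = np.zeros(2)
lam, lr = 0.01, 0.1
for _ in range(500):
    margins = y * (X @ w)
    active = margins < 1  # points violating the margin (hinge is active)
    # Subgradient: lam*w minus the mean of y_i x_i over active points.
    grad = lam * w - (y[active, None] * X[active]).sum(0) / n
    w -= lr * grad

accuracy = (np.sign(X @ w) == y).mean()
```

The paper's decomposition applies here: the sample error measures how far this empirical minimizer is from the best function in the hypothesis class, while the regularization error (its harder half) measures the bias introduced by the λ‖w‖² penalty.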