This paper presents a non-asymptotic statistical analysis of Kernel-PCA with a focus different from that of previous work on this topic ([2], [9]). Here, instead of considering the reconstruction error of KPCA, we are interested in approximation error bounds for the eigenspaces themselves. We prove an upper bound depending on the spacing between… (More)
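
As a rough numerical illustration of the phenomenon this abstract refers to (eigenspace stability governed by the spacing between eigenvalues), the sketch below perturbs a matrix with a large top eigengap and measures how little the top eigenspace moves. The matrix, perturbation, and bound are illustrative choices, not taken from the paper.

```python
import numpy as np

def sin_angle(u, v):
    # sine of the angle between two unit vectors (1-D eigenspaces)
    c = abs(float(u @ v)) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.sqrt(max(0.0, 1.0 - min(c, 1.0) ** 2))

A = np.diag([3.0, 1.0, 0.0])      # spacing between top eigenvalues: 3 - 1 = 2
E = 0.01 * np.ones((3, 3))        # small symmetric perturbation, ||E||_2 = 0.03
_, vecs = np.linalg.eigh(A + E)
u_top = vecs[:, -1]               # top eigenvector of the perturbed matrix
e1 = np.array([1.0, 0.0, 0.0])    # top eigenvector of A
angle = sin_angle(e1, u_top)      # small, because the spacing is large
```

Davis–Kahan-type bounds of this flavor control the angle by the perturbation norm divided by the eigengap, which is why the spacing appears in the paper's upper bound.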

The main goal of this paper is to prove inequalities on the reconstruction error for Kernel Principal Component Analysis. With respect to previous work on this topic, our contribution is twofold: (1) we give bounds that explicitly take into account the empirical centering step of the algorithm, and (2) we show that a "localized" approach allows one to obtain… (More)
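
The empirical centering step mentioned in this abstract can be made concrete: in KPCA, centering the data in feature space amounts to double-centering the kernel matrix before the eigendecomposition. A minimal numpy sketch (illustrative, not the paper's analysis):

```python
import numpy as np

def centered_kernel_pca(K, n_components):
    """KPCA on a precomputed kernel matrix, including the empirical
    centering step: the kernel matrix is double-centered before the
    eigendecomposition."""
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one  # centering in feature space
    eigvals, eigvecs = np.linalg.eigh(Kc)
    # eigh returns eigenvalues in ascending order; keep the largest ones
    idx = np.argsort(eigvals)[::-1][:n_components]
    return eigvals[idx], eigvecs[:, idx]

# toy example: linear kernel on random data
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
vals, vecs = centered_kernel_pca(X @ X.T, n_components=2)
```

The bounds in the paper account for the extra fluctuation introduced by estimating the mean in this centering step.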

Huber's criterion is a useful method for robust regression. The adaptive least absolute shrinkage and selection operator (lasso) is a popular technique for simultaneous estimation and variable selection. The adaptive weights in the adaptive lasso allow the estimator to enjoy the oracle properties. In this paper we propose to combine Huber's criterion with the adaptive… (More)
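
A minimal sketch of the combination this abstract describes, via proximal gradient descent: Huber's criterion for the data fit, adaptive-lasso weights (inverse of a pilot estimate) for the penalty. The pilot estimator, step size, and tuning values below are illustrative choices, not the paper's procedure.

```python
import numpy as np

def huber_grad(r, delta=1.345):
    # derivative of the Huber loss, applied to the residuals
    return np.clip(r, -delta, delta)

def adaptive_lasso_huber(X, y, lam=0.1, delta=1.345, n_iter=500):
    """Proximal-gradient sketch: smooth Huber data-fit term plus a
    weighted l1 penalty with adaptive weights 1/|pilot estimate|."""
    n, p = X.shape
    beta_init, *_ = np.linalg.lstsq(X, y, rcond=None)  # pilot estimate
    w = 1.0 / (np.abs(beta_init) + 1e-8)               # adaptive weights
    step = 1.0 / np.linalg.norm(X, 2) ** 2             # 1/L for the smooth part
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        beta = beta + step * X.T @ huber_grad(r, delta)
        thr = step * lam * w
        beta = np.sign(beta) * np.maximum(np.abs(beta) - thr, 0.0)  # weighted soft-threshold
    return beta

# toy example: one sparse signal coordinate, heavy-tailed noise
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
beta_true = np.zeros(10)
beta_true[0] = 2.0
y = X @ beta_true + rng.standard_t(df=2, size=100) * 0.3
beta_hat = adaptive_lasso_huber(X, y, lam=0.2)
```

The Huber term keeps the fit robust to the heavy-tailed noise, while the adaptive weights shrink the irrelevant coordinates much harder than the signal coordinate.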

This paper investigates the effect of Kernel Principal Component Analysis (KPCA) within the classification framework, focusing on the regularization properties of this dimensionality reduction method. KPCA has previously been used as a pre-processing step before applying an SVM, but we point out that this approach is somewhat redundant from a regularization… (More)
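
A minimal numpy sketch of the kind of pipeline this abstract discusses: extract the first KPCA component and feed it to a trivial linear rule. The two-cluster dataset, RBF bandwidth, and threshold classifier are illustrative assumptions, not the paper's experiment.

```python
import numpy as np

def rbf_kernel(X, gamma):
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

# two well-separated Gaussian clusters (illustrative geometry)
rng = np.random.default_rng(0)
n = 100
X = np.vstack([rng.normal([-3.0, 0.0], 0.5, size=(n, 2)),
               rng.normal([3.0, 0.0], 0.5, size=(n, 2))])
y = np.concatenate([-np.ones(n), np.ones(n)])

K = rbf_kernel(X, gamma=0.5)
m = K.mean(axis=0)
Kc = K - m[None, :] - m[:, None] + K.mean()   # empirical centering
_, vecs = np.linalg.eigh(Kc)
z = vecs[:, -1]                               # first KPCA component
# the first component already makes the classes linearly separable,
# so a simple sign threshold suffices as the downstream classifier
pred = np.where(z > 0.0, 1.0, -1.0)
acc = max(np.mean(pred == y), np.mean(pred == -y))
```

The paper's point is that the kernel dimensionality reduction and the SVM's own regularization play overlapping roles, which this kind of pipeline makes easy to probe.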

A new method for the binary classification problem is studied. It relies on empirical minimization of the hinge loss over an increasing sequence of finite-dimensional spaces. A suitable dimension is picked by minimizing the regularized loss, where the regularization term is proportional to the dimension. An oracle-type inequality is established, which… (More)
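
The procedure this abstract outlines can be sketched as follows: minimize the empirical hinge loss over nested coordinate subspaces of increasing dimension, then pick the dimension minimizing the loss plus a penalty proportional to the dimension. The optimizer, penalty constant, and nested coordinate spaces below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def hinge_risk(w, X, y):
    return np.mean(np.maximum(0.0, 1.0 - y * (X @ w)))

def fit_hinge(X, y, n_iter=2000, lr=0.05):
    # plain subgradient descent on the empirical hinge loss
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margin = y * (X @ w)
        active = margin < 1.0
        grad = -(X[active].T @ y[active]) / n
        w -= lr * grad
    return w

def select_dimension(X, y, dims, c=5.0):
    """Pick the dimension minimizing empirical hinge loss plus a
    penalty proportional to the dimension (constant c is illustrative)."""
    n = len(y)
    best = None
    for d in dims:
        w = fit_hinge(X[:, :d], y)
        score = hinge_risk(w, X[:, :d], y) + c * d / n
        if best is None or score < best[0]:
            best = (score, d)
    return best[1]

# toy example: only the first coordinate carries the label
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = np.sign(X[:, 0])
d_sel = select_dimension(X, y, dims=[1, 2, 5, 10])
```

Because the penalty grows linearly in the dimension, the selected space balances fit against complexity, which is what the oracle-type inequality in the paper quantifies.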

A new method for the binary classification problem is studied. It relies on empirical minimization of the hinge risk over an increasing sequence of finite-dimensional spaces. A suitable dimension is picked by minimizing the regularized risk, where the regularization term is proportional to the dimension. An oracle-type inequality is established for the… (More)
