
- Laurent Zwald, Olivier Bousquet, Gilles Blanchard
- Machine Learning
- 2004

The main goal of this paper is to prove inequalities on the reconstruction error for kernel principal component analysis. With respect to previous work on this topic, our contribution is twofold: (1) we give bounds that explicitly take into account the empirical centering step in this algorithm, and (2) we show that a “localized” approach allows one to obtain…
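The centered reconstruction-error quantity this abstract refers to can be computed directly from the Gram matrix. Below is a minimal NumPy sketch (not the paper's code), assuming an RBF kernel on toy Gaussian data; it uses the fact that, after the empirical centering step, the error of projecting onto the top-d empirical eigenspace equals the tail sum of the eigenvalues of the centered Gram matrix divided by n.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix of the Gaussian RBF kernel k(x, y) = exp(-gamma ||x - y||^2)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kpca_reconstruction_error(X, d, gamma=1.0):
    """Empirical KPCA reconstruction error with explicit centering.

    After centering in feature space, the mean squared distance of the
    mapped points to their projection onto the top-d empirical eigenspace
    equals the sum of the discarded eigenvalues of K_centered / n.
    """
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Empirical centering step: H K H with H = I - (1/n) 11^T.
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # Eigenvalues of Kc / n are those of the empirical covariance operator.
    eigvals = np.linalg.eigvalsh(Kc / n)[::-1]   # descending order
    # Reconstruction error = spectral mass beyond the top d components.
    return eigvals[d:].sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
for d in (1, 2, 5, 10):
    print(d, kpca_reconstruction_error(X, d))
```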

- Laurent Zwald, Gilles Blanchard
- NIPS
- 2005

This paper presents a non-asymptotic statistical analysis of kernel PCA with a focus different from that of previous work on this topic ([2], [9]). Here, instead of considering the reconstruction error of KPCA, we are interested in approximation error bounds for the eigenspaces themselves. We prove an upper bound depending on the spacing between…
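As a rough illustration of the eigenspace (rather than reconstruction-error) viewpoint, the sketch below uses plain linear PCA as a stand-in for the kernel case: it compares the projector onto the top-d empirical eigenspace with the population one and shows the error growing as the spacing between the d-th and (d+1)-th eigenvalues shrinks. The spectra, sample size, and Frobenius metric are all illustrative choices, not the paper's.

```python
import numpy as np

def top_d_projector(cov, d):
    """Orthogonal projector onto the span of the top-d eigenvectors of cov."""
    w, V = np.linalg.eigh(cov)            # eigh returns ascending eigenvalues
    U = V[:, np.argsort(w)[::-1][:d]]     # columns: top-d eigenvectors
    return U @ U.T

rng = np.random.default_rng(0)
p, d, n = 10, 2, 2000

for gap in (2.0, 0.05):                   # spacing lambda_d - lambda_{d+1}
    spectrum = np.concatenate(([5.0, 4.0], np.full(p - d, 4.0 - gap)))
    P_true = top_d_projector(np.diag(spectrum), d)
    X = rng.normal(size=(n, p)) * np.sqrt(spectrum)   # X ~ N(0, diag(spectrum))
    P_hat = top_d_projector(np.cov(X, rowvar=False), d)
    err = np.linalg.norm(P_hat - P_true)  # Frobenius distance between projectors
    print(f"gap={gap:5.2f}  ||P_hat - P||_F = {err:.3f}")
```

The small-gap run gives a visibly larger subspace error at the same sample size, which is the qualitative behaviour a spacing-dependent upper bound captures.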

- Laurent Zwald, Régis Vert, Gilles Blanchard, Pascal Massart
- NIPS
- 2004

This paper investigates the effect of Kernel Principal Component Analysis (KPCA) within the classification framework, in particular the regularization properties of this dimensionality reduction method. KPCA has previously been used as a pre-processing step before applying an SVM, but we point out that this method is somewhat redundant from a regularization…
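The pipeline under discussion, KPCA as a pre-processing step followed by an SVM, can be set up in a few lines of scikit-learn. This is only a sketch of that setup next to a plain kernel SVM, not the paper's analysis; the dataset, kernel width, and number of components are arbitrary.

```python
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC, LinearSVC

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# KPCA as a pre-processing step, then a linear SVM on the projected data.
kpca_svm = make_pipeline(KernelPCA(n_components=10, kernel="rbf", gamma=2.0),
                         LinearSVC())
kpca_svm.fit(X_tr, y_tr)

# Plain kernel SVM: the regularization happens inside the SVM itself,
# which is the sense in which the KPCA step can be redundant.
svm = SVC(kernel="rbf", gamma=2.0).fit(X_tr, y_tr)

print("KPCA + linear SVM:", kpca_svm.score(X_te, y_te))
print("kernel SVM       :", svm.score(X_te, y_te))
```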

Huber’s criterion is a useful method for robust regression. The adaptive least absolute shrinkage and selection operator (lasso) is a popular technique for simultaneous estimation and variable selection; the adaptive weights in the adaptive lasso allow the estimator to attain the oracle properties. In this paper we propose to combine Huber’s criterion with the adaptive…
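A minimal sketch of the combination described here, assuming a proximal-gradient (ISTA) solver, adaptive weights w_j = 1/|β̂_j| from a least-squares pilot fit, and illustrative tuning constants; the paper's own algorithm and tuning may differ.

```python
import numpy as np

def huber_psi(r, delta):
    """Derivative of the Huber loss: identity on [-delta, delta], clipped outside."""
    return np.clip(r, -delta, delta)

def huber_adaptive_lasso(X, y, lam=5.0, delta=1.345, n_iter=3000):
    """Proximal gradient (ISTA) for
        minimize_beta  sum_i huber(y_i - x_i'beta) + lam * sum_j w_j |beta_j|
    with adaptive weights w_j = 1/|beta_pilot_j| from a least-squares pilot fit.
    """
    beta_pilot = np.linalg.lstsq(X, y, rcond=None)[0]
    w = 1.0 / (np.abs(beta_pilot) + 1e-8)       # adaptive lasso weights
    step = 1.0 / np.linalg.norm(X, 2) ** 2      # 1/L; huber_psi is 1-Lipschitz
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = -X.T @ huber_psi(y - X @ beta, delta)   # gradient of Huber term
        z = beta - step * grad
        thr = step * lam * w
        beta = np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)  # weighted soft-threshold
    return beta

# Toy check: sparse truth plus heavy-tailed noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
beta_true = np.array([3.0, -2.0] + [0.0] * 8)
y = X @ beta_true + rng.standard_t(df=2, size=200)
print(np.round(huber_adaptive_lasso(X, y), 2))
```

The Huber loss keeps the gradient bounded on the gross residuals, while the adaptive weights penalize coordinates with small pilot estimates more heavily, which is where the oracle-style behaviour comes from.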

- Gilles Blanchard, Laurent Zwald
- IEEE Transactions on Information Theory
- 2008

In this paper, a new method for the binary classification problem is studied. It relies on empirical minimization of the hinge risk over an increasing sequence of finite-dimensional spaces. A suitable dimension is picked by minimizing the regularized risk, where the regularization term is proportional to the dimension. An oracle-type inequality is…
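A toy sketch of this selection rule, assuming the nested spaces are spanned by the first d coordinates of a fixed basis and using scikit-learn's LinearSVC with a large C as a stand-in for unpenalized hinge-risk minimization; the data and the penalty constant are illustrative, not the paper's.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import hinge_loss

rng = np.random.default_rng(0)
n, p = 400, 30
X = rng.normal(size=(n, p))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=n)).astype(int)

C_pen = 0.05                                  # penalty constant (illustrative)
best = None
for d in range(1, p + 1):
    # Empirical hinge-risk minimization over the space spanned by the
    # first d coordinates (an increasing sequence of spaces in d).
    clf = LinearSVC(C=1e4, max_iter=20000).fit(X[:, :d], y)
    risk = hinge_loss(y, clf.decision_function(X[:, :d]))
    penalized = risk + C_pen * d / n          # regularization proportional to d
    if best is None or penalized < best[0]:
        best = (penalized, d)
print("selected dimension:", best[1])
```

The empirical risk can only decrease as d grows, so the dimension-proportional term is what stops the rule from always picking the largest space.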
