This paper presents a non-asymptotic statistical analysis of Kernel-PCA with a focus different from that of previous work on this topic ([2], [9]). Instead of considering the reconstruction error of KPCA, we are interested in approximation error bounds for the eigenspaces themselves. We prove an upper bound depending on the spacing between …
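The truncated bound is of the kind that typically follows from a Davis–Kahan-type perturbation argument. As a hedged illustration of the general shape (not the paper's exact statement), write $P_k$ and $\hat P_k$ for the orthogonal projectors onto the top-$k$ eigenspaces of the population and empirical kernel operators $K$ and $\hat K$; then, schematically,

$$\|\hat P_k - P_k\| \;\lesssim\; \frac{\|\hat K - K\|}{\lambda_k - \lambda_{k+1}},$$

so the bound degrades as the spacing $\lambda_k - \lambda_{k+1}$ between consecutive eigenvalues shrinks.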
The main goal of this paper is to prove inequalities on the reconstruction error for Kernel Principal Component Analysis. With respect to previous work on this topic, our contribution is twofold: (1) we give bounds that explicitly take into account the empirical centering step in this algorithm, and (2) we show that a "localized" approach allows one to obtain …
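For concreteness, here is a minimal sketch of kernel PCA including the empirical centering step the abstract refers to, i.e. double-centering the Gram matrix. The RBF kernel, bandwidth, and data are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Empirical centering in feature space: K_c = (I - 1/n) K (I - 1/n)
    one_n = np.full((n, n), 1.0 / n)
    K_c = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition of the centered Gram matrix (eigh returns ascending order)
    eigvals, eigvecs = np.linalg.eigh(K_c)
    idx = np.argsort(eigvals)[::-1][:n_components]
    lam, alpha = eigvals[idx], eigvecs[:, idx]
    # Rescale so each feature-space eigenvector has unit norm
    alpha = alpha / np.sqrt(np.maximum(lam, 1e-12))
    # Projections of the training points onto the principal components
    return K_c @ alpha

scores = kernel_pca(np.random.randn(50, 3), n_components=2)
```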
Huber's criterion is a useful method for robust regression. The adaptive least absolute shrinkage and selection operator (lasso) is a popular technique for simultaneous estimation and variable selection. The adaptive weights in the adaptive lasso yield the oracle properties. In this paper we propose to combine Huber's criterion and adaptive …
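A hedged sketch of one way such a combination can be computed: proximal gradient descent on the Huber loss with a weighted (adaptive) $\ell_1$ penalty, where the weights come from a pilot least-squares fit as in the usual adaptive-lasso recipe. The solver, the tuning constant delta = 1.345, and the penalty level are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

def huber_grad(r, delta):
    # Derivative of the Huber loss: identity near 0, clipped beyond delta
    return np.clip(r, -delta, delta)

def huber_adaptive_lasso(X, y, lam=0.1, delta=1.345, n_iter=500):
    n, p = X.shape
    # Pilot least-squares estimate used to build the adaptive weights
    beta_init = np.linalg.lstsq(X, y, rcond=None)[0]
    w = 1.0 / np.maximum(np.abs(beta_init), 1e-8)
    beta = np.zeros(p)
    L = np.linalg.norm(X, 2) ** 2 / n   # Lipschitz constant of the smooth part
    step = 1.0 / L
    for _ in range(n_iter):
        grad = -X.T @ huber_grad(y - X @ beta, delta) / n
        z = beta - step * grad
        # Weighted soft-thresholding = prox of the adaptive-lasso penalty
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
    return beta
```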
This paper investigates the effect of Kernel Principal Component Analysis (KPCA) within the classification framework, essentially the regularization properties of this dimensionality reduction method. KPCA has been previously used as a pre-processing step before applying an SVM, but we point out that this method is somewhat redundant from a regularization …
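The pipeline in question can be sketched as follows: KPCA followed by a linear SVM, next to a kernel SVM trained directly, which already carries its own regularization. The dataset, kernel, and hyperparameters below are illustrative choices, not those studied in the paper.

```python
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC, LinearSVC

X, y = make_circles(n_samples=400, noise=0.1, factor=0.4, random_state=0)

# KPCA as a pre-processing step, followed by a linear SVM in the reduced space
kpca_svm = make_pipeline(
    KernelPCA(n_components=10, kernel="rbf", gamma=2.0),
    LinearSVC(C=1.0),
)

# Kernel SVM applied directly, with its own built-in regularization
kernel_svm = SVC(kernel="rbf", gamma=2.0, C=1.0)

for name, clf in [("KPCA + linear SVM", kpca_svm), ("kernel SVM", kernel_svm)]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```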