
- Bernhard Schölkopf, Alexander J. Smola, Klaus-Robert Müller
- Neural Computation
- 1998

A new method for performing a nonlinear form of principal component analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map—for instance, the space of all possible five-pixel products in 16 × 16 images. We give…
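As a rough illustration of the idea (not code from the paper), kernel PCA can be sketched in NumPy: build a kernel matrix, center it in feature space, and eigendecompose. The RBF kernel and `gamma = 0.5` are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))

# RBF kernel matrix (gamma is an illustrative choice)
gamma = 0.5
sq = np.sum(X**2, axis=1)
K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

# Center the kernel matrix in feature space
n = K.shape[0]
one_n = np.ones((n, n)) / n
Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

# Eigendecomposition; sort eigenpairs in decreasing order
eigvals, eigvecs = np.linalg.eigh(Kc)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Normalize coefficients so feature-space eigenvectors have unit norm,
# then project the training points onto the top two components
alphas = eigvecs[:, :2] / np.sqrt(eigvals[:2])
X_kpca = Kc @ alphas  # nonlinear principal component scores
```

The expensive nonlinear map is never computed explicitly; everything goes through the kernel matrix, which is the point of the kernel trick.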

- Bernhard Schölkopf, John C. Platt, John Shawe-Taylor, Alexander J. Smola, Robert C. Williamson
- Neural Computation
- 2001

Suppose you are given some data set drawn from an underlying probability distribution P and you want to estimate a "simple" subset S of input space such that the probability that a test point drawn from P lies outside of S equals some a priori specified value between 0 and 1. We propose a method to approach this problem by trying to estimate a function f…
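This estimator is available as `OneClassSVM` in scikit-learn; a minimal usage sketch, assuming synthetic standard-normal training data, where `nu` upper-bounds the fraction of training points allowed outside the estimated region:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 2))

# nu upper-bounds the fraction of training points treated as outliers
clf = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(X_train)

# +1 = inside the estimated region S, -1 = outside
preds = clf.predict(np.array([[0.0, 0.0], [6.0, 6.0]]))
```

A point near the data mode falls inside the region; one far from all training data falls outside.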

We consider the general problem of learning from labeled and unlabeled data, which is often called semi-supervised learning or transductive inference. A principled approach to semi-supervised learning is to design a classifying function which is sufficiently smooth with respect to the intrinsic structure collectively revealed by known labeled and unlabeled…
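scikit-learn's `LabelSpreading` implements a closely related smoothness-based algorithm; a sketch on a toy two-moons set, with only three labels revealed per class (`-1` marks unlabeled points). The dataset and parameters here are illustrative, not from the paper.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.semi_supervised import LabelSpreading

X, y = make_moons(n_samples=200, noise=0.05, random_state=0)

# Hide most labels: -1 marks unlabeled points
labels = np.full_like(y, -1)
for cls in (0, 1):
    labels[np.where(y == cls)[0][:3]] = cls

model = LabelSpreading(kernel="rbf", gamma=20).fit(X, labels)

# transduction_ holds the inferred labels for all points
acc = (model.transduction_ == y).mean()
```

The few known labels propagate along the intrinsic cluster structure, so most unlabeled points end up correctly classified.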

- Alexander J. Smola, Bernhard Schölkopf
- Statistics and Computing
- 2004

In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing with large datasets. Finally, we mention some modifications…
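For reference, SV regression as surveyed here is available as `SVR` in scikit-learn; a minimal sketch on synthetic data, where `epsilon` sets the width of the insensitive tube and `C` trades off flatness against training error (values here are illustrative):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 5, size=(80, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=80)

# epsilon-insensitive loss: errors inside the tube are ignored
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
pred = model.predict(X)
rmse = np.sqrt(np.mean((pred - y) ** 2))
```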

- Olivier Chapelle, Bernhard Schölkopf, +6 authors Tom Mitchell
- 2007


A new method for performing a nonlinear form of Principal Component Analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance the space of all possible d-pixel products in images. We give the derivation…

- Sören Sonnenburg, Gunnar Rätsch, Christin Schäfer, Bernhard Schölkopf
- Journal of Machine Learning Research
- 2006

While classical kernel-based learning algorithms are based on a single kernel, in practice it is often desirable to use multiple kernels. Lanckriet et al. (2004) considered conic combinations of kernel matrices for classification, leading to a convex quadratically constrained quadratic program. We show that it can be rewritten as a semi-infinite linear…

- Bernhard Schölkopf, Alexander J. Smola, Robert C. Williamson, Peter L. Bartlett
- Neural Computation
- 2000

We describe a new class of Support Vector algorithms for regression and classification. In these algorithms, a parameter ν lets one effectively control the number of Support Vectors. While this can be useful in its own right, the parametrization has the additional benefit of enabling us to eliminate one of the other free parameters of the algorithm: the accuracy…
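The ν-parametrization is exposed in scikit-learn as `NuSVC`/`NuSVR`; a sketch on two synthetic Gaussian blobs, where ν lower-bounds the fraction of support vectors and upper-bounds the fraction of margin errors (the data and `nu=0.2` are illustrative):

```python
import numpy as np
from sklearn.svm import NuSVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, size=(50, 2)),
               rng.normal(2, 1, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

# nu in (0, 1]: lower bound on the fraction of support vectors,
# upper bound on the fraction of margin errors
clf = NuSVC(nu=0.2, kernel="rbf", gamma="scale").fit(X, y)
frac_sv = clf.support_.size / len(X)
pred = clf.predict(np.array([[-2.0, -2.0]]))
```

Unlike the C parameter of the standard SVM, ν has this direct interpretation as a fraction of the training set.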

We propose a framework for analyzing and comparing distributions, allowing us to design statistical tests to determine if two samples are drawn from different distributions. Our test statistic is the largest difference in expectations over functions in the unit ball of a reproducing kernel Hilbert space (RKHS). We present two tests based on large deviation… (More)
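For an RKHS given by a kernel k, this statistic (the maximum mean discrepancy, MMD) has a closed-form sample estimate; a NumPy sketch of the biased V-statistic estimate of squared MMD, with an illustrative RBF kernel and bandwidth:

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    sq = (np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :]
          - 2 * A @ B.T)
    return np.exp(-gamma * sq)

def mmd2_biased(X, Y, gamma=0.5):
    # Biased V-statistic estimate of squared MMD:
    # mean k(x,x') - 2 mean k(x,y) + mean k(y,y')
    return (rbf_kernel(X, X, gamma).mean()
            - 2 * rbf_kernel(X, Y, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0, 1, size=(200, 1))
Y_same = rng.normal(0, 1, size=(200, 1))   # same distribution as X
Y_diff = rng.normal(3, 1, size=(200, 1))   # shifted distribution

mmd_same = mmd2_biased(X, Y_same)
mmd_diff = mmd2_biased(X, Y_diff)
```

When the two samples come from the same distribution the estimate is near zero; a shifted distribution produces a clearly larger value, which is what the resulting two-sample tests threshold.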