
- Sören Sonnenburg, Gunnar Rätsch, Christin Schäfer, Bernhard Schölkopf
- Journal of Machine Learning Research
- 2006

While classical kernel-based learning algorithms are based on a single kernel, in practice it is often desirable to use multiple kernels. Lanckriet et al. (2004) considered conic combinations of kernel matrices for classification, leading to a convex quadratically constrained quadratic program. We show that it can be rewritten as a semi-infinite linear…
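A toy sketch of the property the abstract relies on (not the paper's semi-infinite LP algorithm): a conic combination of valid kernel matrices, with assumed nonnegative weights, is itself a valid positive semidefinite kernel matrix.

```python
import numpy as np

# Toy illustration, not the paper's algorithm: a conic combination
# of valid kernel matrices with nonnegative weights is again a
# valid (positive semidefinite) kernel matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))

K_lin = X @ X.T                                # linear kernel
sq = np.sum(X**2, axis=1)
D = sq[:, None] + sq[None, :] - 2 * K_lin      # squared pairwise distances
K_rbf = np.exp(-D / 2.0)                       # Gaussian (RBF) kernel

beta = np.array([0.3, 0.7])                    # assumed nonnegative weights
K = beta[0] * K_lin + beta[1] * K_rbf          # conic combination

# PSD check: smallest eigenvalue is nonnegative up to round-off.
assert np.linalg.eigvalsh(K).min() > -1e-9
```

Learning the weights `beta` from data, rather than fixing them, is exactly the multiple kernel learning problem the abstract describes.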

Learning linear combinations of multiple kernels is an appealing strategy when the right choice of features is unknown. Previous approaches to multiple kernel learning (MKL) promote sparse kernel combinations to support interpretability. Unfortunately, ℓ1-norm MKL is hardly observed to outperform trivial baselines in practical applications. To allow for…

Recently, Jaakkola and Haussler (1999) proposed a method for constructing kernel functions from probabilistic models. Their so-called Fisher kernel has been combined with discriminative classifiers such as support vector machines and applied successfully in, for example, DNA and protein analysis. Whereas the Fisher kernel is calculated from the marginal…
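A hedged toy sketch of the Fisher-kernel construction the abstract refers to, for an assumed 1-D Gaussian model with fixed unit variance: the Fisher score is the gradient of the log-likelihood with respect to the mean, and the Fisher information is 1, so the kernel reduces to a plain product of scores.

```python
# Toy sketch of the Fisher-kernel idea (Jaakkola & Haussler, 1999),
# not the paper's construction: for p(x | mu) = N(mu, 1), the Fisher
# score is U_x = d/dmu log p(x | mu) = x - mu, and the Fisher
# information I(mu) = 1, so K(x1, x2) = U_x1 * U_x2.
MU = 0.5                                  # assumed model parameter

def fisher_score(x):
    return x - MU                         # gradient of log-likelihood

def fisher_kernel(x1, x2):
    return fisher_score(x1) * fisher_score(x2)

# Points on the same side of the mean get positive similarity:
assert fisher_kernel(2.0, 3.0) > 0
assert fisher_kernel(-1.0, 2.0) < 0
```

In practice the model is a richer generative model (e.g. an HMM over sequences) and the score is a gradient vector over all its parameters; the inner-product structure is the same.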

- Vojtech Franc, Sören Sonnenburg
- ICML
- 2008

We have developed a new linear Support Vector Machine (SVM) training algorithm called OCAS. Its computational effort scales linearly with the sample size. In an extensive empirical evaluation, OCAS significantly outperforms current state-of-the-art SVM solvers such as SVMlight, SVMperf and BMRM, achieving speedups of over 1,000 on some…
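For context, a minimal sketch of the linear-SVM primal objective that OCAS optimizes, solved here by plain stochastic subgradient descent (Pegasos-style) on assumed toy data — this is NOT the OCAS cutting-plane method itself, only the problem it solves:

```python
import numpy as np

# Minimal sketch, not OCAS: stochastic subgradient descent on the
# linear-SVM primal that OCAS also optimizes:
#   min_w  lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>)
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = np.sign(X @ w_true)                   # linearly separable toy labels

lam = 0.01
w = np.zeros(d)
for t in range(1, 2001):
    i = rng.integers(n)
    eta = 1.0 / (lam * t)                 # standard decreasing step size
    if y[i] * (X[i] @ w) < 1:             # margin violated: hinge active
        w = (1 - eta * lam) * w + eta * y[i] * X[i]
    else:
        w = (1 - eta * lam) * w           # only the regularizer shrinks w

train_acc = np.mean(np.sign(X @ w) == y)
```

OCAS attacks the same objective with cutting planes and a line search, which is what yields the reported linear scaling and large speedups.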

- Asa Ben-Hur, Cheng Soon Ong, Sören Sonnenburg, Bernhard Schölkopf, Gunnar Rätsch
- PLoS Computational Biology
- 2008

The increasing wealth of biological data coming from a large variety of platforms and the continued development of new high-throughput methods for probing biological systems require increasingly more sophisticated computational approaches. Putting all these data in simple to use databases is a first step; but realizing the full potential of the data…

- Gunnar Rätsch, Sören Sonnenburg, Bernhard Schölkopf
- ISMB
- 2005

**Motivation:** Eukaryotic pre-mRNAs are spliced to form mature mRNA. Pre-mRNA alternative splicing greatly increases the complexity of gene expression. Estimates show that more than half of the human genes and at least one-third of the genes of less complex organisms, such as nematodes or flies, are alternatively spliced. In this work, we consider one major…

- Sören Sonnenburg, Alexander Zien, Gunnar Rätsch
- ISMB
- 2006

We develop new methods for finding transcription start sites (TSS) of RNA Polymerase II binding genes in genomic DNA sequences. Employing Support Vector Machines with advanced sequence kernels, we achieve drastically higher prediction accuracies than state-of-the-art methods.

**Motivation:** One of the most important features of genomic DNA is the…

- Sören Sonnenburg, Gunnar Rätsch, Christin Schäfer
- NIPS
- 2005

While classical kernel-based learning algorithms are based on a single kernel, in practice it is often desirable to use multiple kernels. Lanckriet et al. (2004) considered conic combinations of kernel matrices for classification, leading to a convex quadratically constrained quadratic program. We show that it can be rewritten as a semi-infinite linear…

- Vojtech Franc, Sören Sonnenburg
- Journal of Machine Learning Research
- 2009

We have developed an optimized cutting plane algorithm (OCA) for solving large-scale risk minimization problems. We prove that the number of iterations OCA requires to converge to an ε-precise solution is approximately linear in the sample size. We also derive OCAS, an OCA-based linear binary Support Vector Machine (SVM) solver, and OCAM, a linear…
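A minimal sketch of the generic cutting-plane idea underlying OCA (Kelley's method on an assumed 1-D toy objective, not the authors' optimized algorithm): repeatedly minimize a piecewise-linear lower bound built from subgradient cuts, adding a new cut at each model minimizer.

```python
import numpy as np

# Kelley-style cutting-plane sketch, not the authors' OCA: minimize
# a convex toy objective via a growing piecewise-linear lower bound
# made of subgradient cuts f(w_t) + g_t * (w - w_t).
def f(w):
    return 0.5 * w**2 + abs(w - 1.0)      # assumed toy objective

def subgrad(w):
    return w + (1.0 if w > 1.0 else -1.0)  # a valid subgradient of f

grid = np.linspace(-3.0, 3.0, 6001)       # bounded search domain
cuts = []
w = 3.0                                   # starting point
for _ in range(25):
    cuts.append((f(w), subgrad(w), w))    # add a cut at the current point
    model = np.max([fv + gv * (grid - wv) for fv, gv, wv in cuts], axis=0)
    w = grid[int(np.argmin(model))]       # minimize the lower-bound model

# The true minimizer is w = 1, where 0 lies in the subdifferential.
```

OCA's contribution is to make this loop efficient for risk minimization at scale (with a line search and careful cut management), yielding the iteration bound the abstract states.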