Publications
Statistical Mechanics of On-Line Learning Under Concept Drift
We introduce a modeling framework for the investigation of on-line machine learning processes in non-stationary environments. We exemplify the approach in terms of two specific model situations.
Differential privacy for learning vector quantization
Prototype-based machine learning methods such as learning vector quantisation (LVQ) offer flexible classification tools, which represent a classification in terms of typical prototypes.
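As background for this entry, the following is a minimal sketch of plain LVQ1 training and prediction, not the differentially private variant studied in the paper; the function names, learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.05, epochs=30, seed=0):
    """Plain LVQ1: attract the winning prototype toward same-class samples,
    repel it from differently labelled ones."""
    rng = np.random.default_rng(seed)
    P = prototypes.astype(float).copy()
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            dists = np.linalg.norm(P - X[i], axis=1)
            w = dists.argmin()                        # closest ("winning") prototype
            sign = 1.0 if proto_labels[w] == y[i] else -1.0
            P[w] += sign * lr * (X[i] - P[w])         # move toward or away from the sample
    return P

def lvq_predict(X, prototypes, proto_labels):
    """Assign each sample the label of its nearest prototype (proto_labels as np.array)."""
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return proto_labels[d.argmin(axis=1)]
```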
Interpretation of linear classifiers by means of feature relevance bounds
Research on feature relevance and feature selection problems goes back several decades, but the importance of these areas continues to grow as more and more data becomes available.
Time Series Prediction for Graphs in Kernel and Dissimilarity Spaces
Graphs are a flexible and general formalism providing rich models in various important domains, such as distributed computing, intelligent tutoring systems or social network analysis.
When can unlabeled data improve the learning rate?
In semi-supervised classification, one is given access both to labeled and unlabeled data. As unlabeled data is typically cheaper to acquire than labeled data, this setup becomes advantageous …
Prototype-based classifiers in the presence of concept drift: A modelling framework
We present a modelling framework for the investigation of prototype-based classifiers in non-stationary environments. Specifically, we study Learning Vector Quantization (LVQ) systems …
Feature Relevance Bounds for Linear Classification
Biomedical applications often aim for an identification of relevant features for a given classification task, since these carry the promise of semantic insight into the underlying process. ForExpand
Convergence of Multi-pass Large Margin Nearest Neighbor Metric Learning
Large margin nearest neighbor classification (LMNN) is a popular technique to learn a metric that improves the accuracy of a simple k-nearest neighbor classifier via a convex optimization scheme.
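LMNN itself is not part of scikit-learn; as a rough illustration of the general pattern (learn a supervised metric, then classify with k-NN in the transformed space), here is a sketch that substitutes scikit-learn's NeighborhoodComponentsAnalysis, which optimizes a related but different objective. Dataset and parameter choices are assumptions, not taken from the paper.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier, NeighborhoodComponentsAnalysis
from sklearn.pipeline import Pipeline

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Learn a linear transformation that pulls same-class neighbours together,
# then run a plain k-NN classifier in the transformed space.
model = Pipeline([
    ("metric", NeighborhoodComponentsAnalysis(random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=3)),
])
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```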
Local Reject Option for Deterministic Multi-class SVM
Classification with reject option allows classifiers to abstain from the classification of unclear cases. While it has been shown that global reject options are optimal for probabilistic classifiers …
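For context, a global reject option can be realized by abstaining whenever the classifier's confidence falls below a fixed threshold. The sketch below shows only this generic global scheme, not the local reject option proposed in the paper; the threshold value and the use of -1 as a "reject" label are assumptions.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=1000).fit(X, y)

proba = clf.predict_proba(X)
threshold = 0.8                                # assumed global confidence threshold
labels = proba.argmax(axis=1)
rejected = proba.max(axis=1) < threshold       # abstain on low-confidence samples
predictions = np.where(rejected, -1, labels)   # -1 marks abstention
```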
FRI - Feature Relevance Intervals for Interpretable and Interactive Data Exploration
Most existing feature selection methods are insufficient for analytic purposes as soon as high-dimensional data or redundant sensor signals are dealt with, since features can be selected due to …