# Distance Learning in Discriminative Vector Quantization

```bibtex
@article{Schneider2009DistanceLI,
  title   = {Distance Learning in Discriminative Vector Quantization},
  author  = {Petra Schneider and Michael Biehl and Barbara Hammer},
  journal = {Neural Computation},
  year    = {2009},
  volume  = {21},
  pages   = {2942-2969}
}
```

## Abstract

Discriminative vector quantization schemes such as learning vector quantization (LVQ) and extensions thereof offer efficient and intuitive classifiers based on the representation of classes by prototypes. The original methods, however, rely on the Euclidean distance, corresponding to the assumption that the data can be represented by isotropic clusters. For this reason, extensions of the methods to more general metric structures have been proposed, such as relevance adaptation in…

## 117 Citations

### Learning vector quantization for proximity data

- Computer Science
- 2016

Proposes a novel extension of LVQ to similarity data based on the kernelization of an underlying probabilistic model: kernel robust soft LVQ (KRSLVQ), which relies on the notion of a pseudo-Euclidean embedding of proximity data.
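
The pseudo-Euclidean embedding underlying KRSLVQ can be sketched via the standard eigendecomposition construction; this is an illustrative reconstruction, not code from the cited paper:

```python
import numpy as np

def pseudo_euclidean_embedding(S):
    # A symmetric similarity matrix S may be indefinite (non-metric data).
    # Eigendecompose and keep coordinates for all non-negligible eigenvalues;
    # negative eigenvalues contribute dimensions with signature -1.
    eigvals, eigvecs = np.linalg.eigh(S)
    keep = np.abs(eigvals) > 1e-10
    eigvals, eigvecs = eigvals[keep], eigvecs[:, keep]
    X = eigvecs * np.sqrt(np.abs(eigvals))   # embedding coordinates, one row per item
    signature = np.sign(eigvals)             # +1 for Euclidean dims, -1 for correction dims
    return X, signature

def pe_inner(x, y, signature):
    # signed "inner product" of the pseudo-Euclidean space
    return float(np.sum(signature * x * y))

# usage: an indefinite similarity matrix is reproduced exactly by its embedding
S = np.array([[1.0, 0.9, 0.1],
              [0.9, 1.0, 0.9],
              [0.1, 0.9, 1.0]])
X, sig = pseudo_euclidean_embedding(S)
S_rec = np.array([[pe_inner(X[i], X[j], sig) for j in range(3)] for i in range(3)])
```

Here `S_rec` matches `S` up to numerical error, and at least one entry of `sig` is -1 because this `S` is not positive semidefinite.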

### Efficient approximations of robust soft learning vector quantization for non-vectorial data

- Computer Science, Neurocomputing
- 2015

### Regularization in Matrix Relevance Learning

- Computer Science, IEEE Transactions on Neural Networks
- 2010

A regularization technique extends recently proposed matrix learning schemes in learning vector quantization (LVQ): augmenting the cost function with an appropriate regularization term prevents the unfavorable (oversimplifying) behavior and can help to improve the generalization ability.
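
The idea can be sketched as follows; the `-ln det` form of the penalty is my assumption based on the abstract, not verified against the paper:

```python
import numpy as np

def regularization_term(Omega, mu):
    # Penalty -mu * ln det(Omega Omega^T): it blows up as the relevance
    # matrix Lambda = Omega^T Omega collapses onto a low-dimensional
    # subspace, which is the unfavorable behavior to be prevented.
    sign, logdet = np.linalg.slogdet(Omega @ Omega.T)
    return -mu * logdet

# usage: a nearly singular Omega is penalized far more than a well-conditioned one
Omega_good = np.eye(2)
Omega_bad = np.diag([1.0, 1e-4])
```

The penalty is added to the LVQ cost function, so the gradient of the regularizer pulls the metric away from degenerate solutions during learning.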

### Border-sensitive learning in generalized learning vector quantization: an alternative to support vector machines

- Computer Science, Soft Comput.
- 2015

Two modifications of LVQ are proposed to make it comparable to SVM: first, border-sensitive learning is introduced to obtain border-responsible prototypes comparable to support vectors in SVM; second, kernel distances for differentiable kernels are considered, so that prototype learning takes place in a metric space isomorphic to the feature-mapping space of the SVM.
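
The kernel-distance part can be illustrated directly: distances between feature-space images are expressible through kernel evaluations alone (the RBF kernel here is my choice of a differentiable kernel, not necessarily the paper's):

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian (RBF) kernel; any differentiable kernel would do
    return float(np.exp(-gamma * np.sum((x - y) ** 2)))

def kernel_distance_sq(x, w, kernel=rbf):
    # squared distance ||phi(x) - phi(w)||^2 in the implicit feature space,
    # computed without ever forming phi explicitly
    return kernel(x, x) - 2.0 * kernel(x, w) + kernel(w, w)

# usage: a prototype w is compared to x purely via kernel values
x = np.array([0.0, 0.0])
w = np.array([1.0, 1.0])
```

For the RBF kernel, k(x, x) = 1, so the squared distance reduces to 2 - 2k(x, w) and is bounded by 2.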

### Advanced methods for prototype-based classification (University of Groningen)

- Computer Science
- 2017

A regularization technique extends recently proposed matrix learning schemes in Learning Vector Quantization (LVQ): augmenting the cost function with an appropriate regularization term prevents the unfavorable (oversimplifying) behavior and can help to improve the generalization ability.

### Divergence-based classification in learning vector quantization

- Computer Science, Neurocomputing
- 2011

### Median variants of learning vector quantization for learning of dissimilarity data

- Computer Science, Neurocomputing
- 2015

### Efficient Approximations of Kernel Robust Soft LVQ

- Computer Science, WSOM
- 2012

This contribution investigates two approximation schemes that lead to sparse models: k-approximations of the prototypes and the Nyström approximation of the Gram matrix.
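
The Nyström part of this idea can be sketched as follows (a generic textbook construction, not the paper's code):

```python
import numpy as np

def nystrom(K, landmarks):
    # Approximate a PSD Gram matrix K from m landmark columns:
    # K ~= C W^+ C^T, with C = K[:, landmarks] and W the landmark sub-block.
    C = K[:, landmarks]
    W = K[np.ix_(landmarks, landmarks)]
    return C @ np.linalg.pinv(W) @ C.T

# usage: a rank-3 linear-kernel Gram matrix is recovered from 5 landmarks
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
K = X @ X.T
K_approx = nystrom(K, landmarks=np.arange(5))
```

Storage drops from O(n^2) to O(nm); when the landmarks span the data, as in this low-rank example, the approximation is exact.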

### Hyperparameter learning in probabilistic prototype-based models

- Computer Science, Neurocomputing
- 2010

## References

Showing 1-10 of 24 references

### Adaptive Relevance Matrices in Learning Vector Quantization

- Computer Science, Neural Computation
- 2009

We propose a new matrix learning scheme to extend relevance learning vector quantization (RLVQ), an efficient prototype-based classification algorithm, toward a general adaptive metric. By…
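
The adaptive metric at the heart of matrix relevance learning can be sketched as follows (a minimal illustration, not the cited implementation):

```python
import numpy as np

def normalize(Omega):
    # keep trace(Lambda) = 1 with Lambda = Omega^T Omega, the usual
    # normalization fixing the overall scale of the adaptive metric
    return Omega / np.sqrt(np.trace(Omega.T @ Omega))

def adaptive_distance(x, w, Omega):
    # generalized squared distance d(x, w) = (x - w)^T Lambda (x - w);
    # writing Lambda = Omega^T Omega keeps the metric positive semidefinite
    diff = Omega @ (x - w)
    return float(diff @ diff)

# usage: a metric that weights the first input dimension more heavily
Omega = normalize(np.diag([3.0, 1.0]))
x = np.array([1.0, 0.0])
w1 = np.array([0.0, 0.0])   # differs from x in the heavily weighted dimension
w2 = np.array([1.2, 1.0])   # differs mostly in the lightly weighted dimension
```

Under this metric, x is closer to w2 even though plain Euclidean distance would assign it to w1, which is exactly the point of replacing the fixed Euclidean metric.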

### Soft Learning Vector Quantization

- Computer Science, Neural Computation
- 2003

This work derives two variants of LVQ from a Gaussian mixture ansatz: it proposes an objective function based on a likelihood ratio, derives a learning rule using gradient descent, and provides a way to extend algorithms of the LVQ family to different distance measures.
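
The likelihood-ratio objective can be sketched for a single sample; equal priors and equal isotropic component widths are simplifying assumptions of this sketch:

```python
import numpy as np

def log_likelihood_ratio(x, y, prototypes, labels, sigma2=1.0):
    # log p(x, y | W) / p(x | W) for a Gaussian mixture with one isotropic
    # component per prototype; maximizing this over the training set is the
    # soft-LVQ-style objective, optimized by gradient ascent
    d = np.array([np.sum((x - w) ** 2) for w in prototypes])
    logp = -d / (2.0 * sigma2)                    # unnormalized log component densities
    def lse(v):
        m = np.max(v)
        return m + np.log(np.sum(np.exp(v - m)))  # numerically stable log-sum-exp
    correct = np.asarray(labels) == y
    return lse(logp[correct]) - lse(logp)

# usage: two prototypes of different classes
prototypes = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
labels = [0, 1]
r_good = log_likelihood_ratio(np.array([0.1, 0.0]), 0, prototypes, labels)
r_bad = log_likelihood_ratio(np.array([0.1, 0.0]), 1, prototypes, labels)
```

The ratio is always at most 0 and approaches 0 when x lies near a prototype of its own class.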

### Regularization in Matrix Relevance Learning

- Computer Science, IEEE Transactions on Neural Networks
- 2010

A regularization technique extends recently proposed matrix learning schemes in learning vector quantization (LVQ): augmenting the cost function with an appropriate regularization term prevents the unfavorable (oversimplifying) behavior and can help to improve the generalization ability.

### Divergence-based classification in learning vector quantization

- Computer Science, Neurocomputing
- 2011

### Hyperparameter learning in probabilistic prototype-based models

- Computer Science, Neurocomputing
- 2010

### Distance Metric Learning for Large Margin Nearest Neighbor Classification

- Computer Science, NIPS
- 2005

This paper shows how to learn a Mahalanobis distance metric for kNN classification from labeled examples in a globally integrated manner and finds that metrics trained in this way lead to significant improvements in kNN classification.

### Relevance determination in Learning Vector Quantization

- Computer Science, ESANN
- 2001

The method is based on Hebbian learning and introduces weighting factors for the input dimensions that are automatically adapted to the specific problem, yielding a possibly more efficient classification and insight into the role of the data dimensions.

### Dynamic Hyperparameter Scaling Method for LVQ Algorithms

- Computer Science, The 2006 IEEE International Joint Conference on Neural Network Proceedings
- 2006

The relationships between the values assigned to the hyperparameters, the on-line learning process, and the structure of the resulting classifier are analyzed, and an annealing method is suggested in which each hyperparameter is initially set to a large value and then slowly decreased during learning.
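
The suggested annealing can be sketched as a simple multiplicative schedule; the decay form and constants here are my illustrative choices, not the paper's:

```python
def annealing_schedule(initial, decay=0.99, floor=1e-3):
    # start the hyperparameter at a large value, then slowly decrease it
    # during on-line learning; the floor keeps it from vanishing entirely
    value = initial
    while True:
        yield max(value, floor)
        value *= decay

# usage: draw one value per learning step
schedule = annealing_schedule(10.0)
first_steps = [next(schedule) for _ in range(3)]
```

A large initial value makes early updates exploratory; the slow decrease lets the classifier structure settle as learning proceeds.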

### Margin Analysis of the LVQ Algorithm

- Computer Science, NIPS
- 2002

This paper presents margin-based generalization bounds suggesting that prototype-based classifiers can be more accurate than the 1-NN rule, and derives a training algorithm that selects a good set of prototypes using large-margin principles.
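
The margin notion used in such bounds can be sketched via the standard hypothesis margin for nearest-prototype classifiers (a minimal reconstruction, not the paper's code):

```python
import numpy as np

def hypothesis_margin(x, y, prototypes, labels):
    # half the difference between the distance to the nearest wrong-class
    # prototype and the nearest correct-class prototype; positive exactly
    # when the nearest-prototype rule classifies (x, y) correctly
    d = np.array([np.linalg.norm(x - w) for w in prototypes])
    labels = np.asarray(labels)
    return 0.5 * (d[labels != y].min() - d[labels == y].min())

# usage: two prototypes, one per class
prototypes = [np.array([0.0, 0.0]), np.array([4.0, 0.0])]
labels = [0, 1]
m_correct = hypothesis_margin(np.array([1.0, 0.0]), 0, prototypes, labels)
m_wrong = hypothesis_margin(np.array([1.0, 0.0]), 1, prototypes, labels)
```

Training procedures derived from such bounds push prototypes so that this margin is large and positive on as many training samples as possible.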