Klaus Obermayer

Learning vector quantization (LVQ) is a popular class of adaptive nearest prototype classifiers for multiclass classification, but learning algorithms from this family have so far been proposed on heuristic grounds. Here, we take a more principled approach and derive two variants of LVQ using a Gaussian mixture ansatz. We propose an objective function based …
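The nearest-prototype classification underlying LVQ can be sketched in a few lines. Below is a minimal illustration of the classical heuristic LVQ1 update rule (the heuristic baseline, not the mixture-based variants derived in the paper); all parameter values are arbitrary choices for the sketch:

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.1, epochs=20):
    """Heuristic LVQ1: move the nearest prototype toward the sample
    if their labels match, and away from it otherwise."""
    P = prototypes.copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            j = np.argmin(np.linalg.norm(P - x, axis=1))  # winning prototype
            sign = 1.0 if proto_labels[j] == label else -1.0
            P[j] += sign * lr * (x - P[j])
    return P

def lvq_predict(X, prototypes, proto_labels):
    """Assign each sample the label of its nearest prototype."""
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return proto_labels[np.argmin(d, axis=1)]
```

After training, classification is a plain nearest-neighbor lookup against the prototypes, which is what makes LVQ attractive for multiclass problems with many samples.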
Cortical computations critically involve local neuronal circuits. The computations are often invariant across a cortical area yet are carried out by networks that can vary widely within an area according to its functional architecture. Here we demonstrate a mechanism by which orientation selectivity is computed invariantly in cat primary visual cortex …
Orientation and ocular dominance maps in the primary visual cortex of mammals are among the most thoroughly investigated of the patterns in the cerebral cortex. A considerable amount of work has been dedicated to unraveling both their detailed structure and the neural mechanisms that underlie their formation and development. Many schemes have been proposed, …
Exact geometrical reconstructions of neuronal architecture are indispensable for the investigation of neuronal function. Neuronal shape is important for the wiring of networks, and dendritic architecture strongly affects neuronal integration and firing properties, as demonstrated by modeling approaches. Confocal microscopy makes it possible to scan neurons with …
We investigate the convergence properties of the self-organizing feature map algorithm for a simple, but very instructive case: the formation of a topographic representation of the unit interval [0,1] by a linear chain of neurons. We extend the proofs of convergence of Kohonen and of Cottrell and Fort to hold in any case where the neighborhood function, …
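The setting studied here, a linear chain of neurons mapping the unit interval, is easy to simulate. The following is a sketch of the standard Kohonen update with a Gaussian neighborhood and annealed parameters; the schedules are illustrative choices, not those analyzed in the paper:

```python
import numpy as np

def som_chain(n_neurons=10, n_steps=5000, seed=0):
    """Self-organizing chain on [0,1]: repeatedly draw a random input,
    find the winning neuron, and pull the winner and its chain
    neighbors toward the input, while the learning rate and the
    neighborhood width decay geometrically (illustrative schedules)."""
    rng = np.random.default_rng(seed)
    w = rng.random(n_neurons)          # random initial weights in [0,1]
    idx = np.arange(n_neurons)
    for t in range(n_steps):
        lr = 0.5 * (0.01 / 0.5) ** (t / n_steps)    # 0.5 -> 0.01
        sigma = 3.0 * (0.1 / 3.0) ** (t / n_steps)  # 3.0 -> 0.1
        x = rng.random()                            # input from [0,1]
        winner = np.argmin(np.abs(w - x))
        h = np.exp(-((idx - winner) ** 2) / (2 * sigma ** 2))
        w += lr * h * (x - w)
    return w
```

With a neighborhood that starts wide and shrinks, the chain typically untangles into a monotonically ordered, topographic representation of the interval, which is exactly the ordered configuration whose stability the convergence proofs address.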
We investigate the problem of learning a classification task on data represented in terms of their pairwise proximities. This representation does not refer to an explicit feature representation of the data items and is thus more general than the standard approach of using Euclidean feature vectors, from which pairwise proximities can always be calculated.
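As a toy illustration of learning directly from pairwise proximities, one simple baseline (not necessarily the approach taken in the paper) treats each row of the proximity matrix as the feature vector of the corresponding item and trains an ordinary linear classifier on those rows; no Euclidean embedding of the items themselves is ever constructed:

```python
import numpy as np

def proximity_features(D):
    """Use the i-th row of the proximity matrix D as the feature
    vector of item i (no explicit feature representation needed)."""
    return np.asarray(D, dtype=float)

def fit_perceptron(F, y, epochs=50, lr=0.1):
    """Plain perceptron on proximity-row features; labels y in {-1, +1}."""
    w = np.zeros(F.shape[1])
    b = 0.0
    for _ in range(epochs):
        for f, t in zip(F, y):
            if t * (f @ w + b) <= 0:   # misclassified or on the boundary
                w += lr * t * f
                b += lr * t
    return w, b
```

The point of the sketch is only that the proximity matrix itself already carries enough information to separate classes when the proximities reflect class structure.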
We describe a new technique for the analysis of dyadic data, where two sets of objects (row and column objects) are characterized by a matrix of numerical values that describe their mutual relationships. The new technique, called potential support vector machine (P-SVM), is a large-margin method for the construction of classifiers and regression functions …
We offer three algorithms for the generation of topographic mappings to the practitioner of unsupervised data analysis. The algorithms are each based on the minimization of a cost function, which is performed using an EM algorithm and deterministic annealing. The soft topographic vector quantization algorithm (STVQ), like the original Self-Organizing Map …
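A single annealing step of a soft topographic vector quantizer of this kind can be sketched as an EM-style update: soft assignments of data points to map nodes at inverse temperature beta, followed by neighborhood-weighted re-estimation of the codebook. This is a simplified sketch of the general idea, not the paper's exact algorithm; `H` is an assumed fixed neighborhood matrix coupling nearby map nodes:

```python
import numpy as np

def stvq_step(X, W, H, beta):
    """E-step: soft assignments from neighborhood-smoothed squared
    distances at inverse temperature beta. M-step: codebook vectors
    become neighborhood-weighted means of the data."""
    # squared distances of every point to every codebook vector
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)
    # smoothed cost of assigning point i to node j: sum_k H[j,k] * d2[i,k]
    s = d2 @ H.T
    s = s - s.min(axis=1, keepdims=True)   # stabilize the exponentials
    # E-step: soft assignment probabilities
    p = np.exp(-beta * s)
    p /= p.sum(axis=1, keepdims=True)
    # M-step: node k's responsibility includes its neighbors' assignments
    r = p @ H
    W_new = (r.T @ X) / r.sum(axis=0)[:, None]
    return W_new, p
```

Deterministic annealing then simply repeats this step while gradually increasing beta, so the initially fuzzy assignments harden as the temperature drops.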