• Corpus ID: 13578242

Large Margin Nearest Neighbor Classification using Curved Mahalanobis Distances

  • Frank Nielsen, Boris Muzellec, Richard Nock
We consider the supervised classification problem of machine learning in Cayley-Klein projective geometries: we show how to learn a curved Mahalanobis metric distance corresponding to either the hyperbolic geometry or the elliptic geometry using the Large Margin Nearest Neighbor (LMNN) framework. We report on our experimental results, and further consider the case of learning a mixed curved Mahalanobis distance. In addition, we show that the Cayley-Klein Voronoi diagrams are affine, and can be… 
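The curved Mahalanobis distances in question are easy to sketch numerically. The snippet below is an illustrative sketch (not the authors' code, and the function names are mine): hyperbolic and elliptic Cayley-Klein distances over the unit M-ellipsoid domain, where taking M = I reduces the hyperbolic case to the classic Beltrami-Klein disk distance.

```python
import numpy as np

def hyperbolic_ck_distance(p, q, M=None):
    """Hyperbolic Cayley-Klein ("curved Mahalanobis") distance.

    Points must satisfy x^T M x < 1 (inside the unit M-ellipsoid,
    the Klein-model domain).  With M = I this is the classic
    Beltrami-Klein disk distance."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    M = np.eye(len(p)) if M is None else np.asarray(M, float)
    spq = 1.0 - p @ M @ q              # bilinear form evaluations
    spp = 1.0 - p @ M @ p
    sqq = 1.0 - q @ M @ q
    ratio = spq / np.sqrt(spp * sqq)   # >= 1 inside the domain
    return float(np.arccosh(max(ratio, 1.0)))  # guard rounding below 1

def elliptic_ck_distance(p, q, M=None):
    """Elliptic Cayley-Klein distance (spherical-type geometry)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    M = np.eye(len(p)) if M is None else np.asarray(M, float)
    spq = 1.0 + p @ M @ q
    spp = 1.0 + p @ M @ p
    sqq = 1.0 + q @ M @ q
    return float(np.arccos(np.clip(spq / np.sqrt(spp * sqq), -1.0, 1.0)))
```

With M = I, the distance from the origin to a point at Euclidean radius r in the Klein disk is artanh(r), which gives a quick sanity check. LMNN would then optimize M under margin constraints, which is not reproduced here.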


Clustering in Hilbert simplex geometry

The Hilbert metric in the probability simplex satisfies the property of information monotonicity; since a canonical Hilbert metric distance can be defined on any bounded convex subset of Euclidean space, this work also considers Hilbert's projective geometry of the elliptope of correlation matrices and studies its clustering performance.
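For concreteness, the Hilbert metric on the open probability simplex has a one-line closed form; this is an illustrative sketch (function name mine), not code from the paper:

```python
import numpy as np

def hilbert_simplex_distance(p, q):
    """Hilbert projective distance between two points of the open
    probability simplex (all coordinates strictly positive):
    HD(p, q) = log(max_i p_i/q_i) - log(min_i p_i/q_i)."""
    r = np.asarray(p, float) / np.asarray(q, float)
    return float(np.log(r.max() / r.min()))
```

The distance is symmetric and invariant to positive rescaling of either argument, reflecting its projective nature.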

Clustering in Hilbert’s Projective Geometry: The Case Studies of the Probability Simplex and the Elliptope of Correlation Matrices

  • F. Nielsen, Ke Sun
  • Mathematics, Computer Science
    Geometric Structures of Information
  • 2018
This work introduces for this clustering task a novel, computationally friendly framework for modeling the probability simplex, termed Hilbert simplex geometry, and also considers Hilbert's projective geometry of the elliptope of correlation matrices, studying its clustering performance.

On Balls in a Hilbert Polygonal Geometry

Hilbert geometry is a metric geometry that extends the hyperbolic Cayley-Klein geometry. In this video, we explain the shape of balls and their properties in a convex polygonal Hilbert geometry.



Classification with mixtures of curved Mahalanobis metrics

It is proved that these curved Mahalanobis k-NN classifiers define piecewise linear decision boundaries, and the performance of learning those metrics within the Large Margin Nearest Neighbor (LMNN) framework is evaluated.

Beyond Mahalanobis metric: Cayley-Klein metric learning

The Cayley-Klein metric is introduced into the computer vision community as a powerful alternative to the widely studied Mahalanobis metric; besides its good characteristics in non-Euclidean spaces, it is shown to generalize the Mahalanobis metric in specific cases.

Learning Local Invariant Mahalanobis Distances

A novel and computationally efficient way to learn a local Mahalanobis metric per datum is proposed, and it is shown how the data should be invariant to any transformation in order to improve performance.
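For reference, the (global) Mahalanobis distance that such local per-datum schemes specialize is straightforward; this is a generic sketch (name mine), with M supplied explicitly rather than learned:

```python
import numpy as np

def mahalanobis_distance(x, y, M):
    """d_M(x, y) = sqrt((x - y)^T M (x - y)) for a symmetric
    positive semidefinite matrix M.  A local-metric scheme of the
    kind described above would attach its own M_i to each datum x_i;
    M = I recovers the Euclidean distance."""
    d = np.asarray(x, float) - np.asarray(y, float)
    return float(np.sqrt(d @ M @ d))
```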

Clustering with Bregman Divergences

This paper proposes and analyzes parametric hard and soft clustering algorithms based on a large class of distortion functions known as Bregman divergences, and shows that there is a bijection between regular exponential families and a large class of Bregman divergences called regular Bregman divergences.
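The definition is compact enough to state in code. A sketch (illustrative, names mine) of the Bregman divergence D_F(x, y) = F(x) − F(y) − ⟨∇F(y), x − y⟩ with two standard generators:

```python
import numpy as np

def bregman_divergence(F, gradF, x, y):
    """D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>,
    for a strictly convex, differentiable generator F."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(F(x) - F(y) - gradF(y) @ (x - y))

# F(x) = ||x||^2 yields the squared Euclidean distance.
sq_norm = lambda v: float(v @ v)
grad_sq_norm = lambda v: 2.0 * v

# F(x) = sum_i x_i log x_i yields the (generalized) KL divergence.
neg_entropy = lambda v: float(np.sum(v * np.log(v)))
grad_neg_entropy = lambda v: np.log(v) + 1.0
```

The reason k-means carries over essentially unchanged is that the arithmetic mean minimizes the average divergence Σᵢ D_F(xᵢ, c) taken with the center c in the second argument, so only the assignment step depends on the chosen F.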

Bregman vantage point trees for efficient nearest neighbor queries

The seminal vp-tree construction and search algorithms are generalized to deal with symmetrized Bregman divergences, which are commonplace in applications of content-based multimedia retrieval.
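The underlying vp-tree is simple to sketch. The version below (illustrative, names mine) uses a plain metric, where the triangle inequality justifies the pruning bound directly; the paper's contribution, adapting such bounds to symmetrized Bregman divergences, is not reproduced here.

```python
import random

class VPTree:
    """Minimal vantage-point tree for nearest-neighbor search
    under a true metric `dist`."""

    def __init__(self, points, dist):
        self.dist = dist
        self.vp = None
        if not points:
            return
        points = list(points)
        self.vp = points.pop(random.randrange(len(points)))
        if not points:
            self.mu, self.inside, self.outside = 0.0, None, None
            return
        ds = sorted(self.dist(self.vp, p) for p in points)
        self.mu = ds[len(ds) // 2]  # median distance to the vantage point
        inner = [p for p in points if self.dist(self.vp, p) < self.mu]
        outer = [p for p in points if self.dist(self.vp, p) >= self.mu]
        self.inside = VPTree(inner, dist) if inner else None
        self.outside = VPTree(outer, dist) if outer else None

    def nearest(self, q, best=None):
        """Return (distance, point) of the nearest stored point to q."""
        if self.vp is None:
            return best
        d = self.dist(q, self.vp)
        if best is None or d < best[0]:
            best = (d, self.vp)
        # Visit the more promising half first, then prune the other
        # half with the ball bound |d - mu| <= current best radius.
        near, far = ((self.inside, self.outside) if d < self.mu
                     else (self.outside, self.inside))
        if near is not None:
            best = near.nearest(q, best)
        if far is not None and abs(d - self.mu) <= best[0]:
            best = far.nearest(q, best)
        return best
```

A query descends into the half-space containing the query point and only crosses the median ball when the current best radius overlaps it, which is exactly the pruning rule that needs reworking once the divergence is no longer a metric.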

Hyperbolic Voronoi Diagrams Made Easy

  • F. Nielsen, R. Nock
  • Computer Science, Mathematics
    2010 International Conference on Computational Science and Its Applications
  • 2010
Two useful primitives on hyperbolic Voronoi diagrams for designing tailored user interfaces of an image-catalog browsing application in the hyperbolic disk are considered: finding nearest neighbors, and computing smallest enclosing balls.

Local Distance Functions: A Taxonomy, New Algorithms, and an Evaluation

  • D. Ramanan, S. Baker
  • Computer Science
    IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2011
A taxonomy for local distance functions is introduced, in which most existing algorithms can be regarded as approximations of the geodesic distance defined by a metric tensor, along with hybrid algorithms that use a combination of techniques to ameliorate overfitting.

Further results on the hyperbolic Voronoi diagrams

The hyperbolic Voronoi diagram in the hyperboloid model is investigated, and it is shown how it reduces to a Klein-type model using central projections.
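The Klein-type reduction is attractive computationally because hyperbolic bisectors become straight (affine) chords in the Klein model, so the Voronoi diagram can be built with ordinary affine machinery. A small numerical check of that linearity (an illustrative sketch, names mine, not code from the paper): equidistance (1 − x·p)/√(1−|p|²) = (1 − x·q)/√(1−|q|²) is linear in x, giving a hyperplane w·x = b.

```python
import numpy as np

def klein_distance(x, y):
    """Hyperbolic distance in the Beltrami-Klein unit disk."""
    num = 1.0 - x @ y
    den = np.sqrt((1.0 - x @ x) * (1.0 - y @ y))
    return float(np.arccosh(max(num / den, 1.0)))

def klein_bisector(p, q):
    """Coefficients (w, b) of the affine bisector {x : w.x = b}
    of p and q in the Klein model.  Derivation: equating
    cosh-distances, the common factor 1/sqrt(1-|x|^2) cancels,
    leaving a condition linear in x."""
    a = 1.0 / np.sqrt(1.0 - p @ p)
    c = 1.0 / np.sqrt(1.0 - q @ q)
    return c * q - a * p, c - a

p, q = np.array([0.3, 0.0]), np.array([-0.5, 0.2])
w, b = klein_bisector(p, q)
x0 = (b / (w @ w)) * w   # the bisector point closest to the origin
```

Any point of the hyperplane lying inside the unit disk is hyperbolically equidistant from p and q, which is the affine-bisector property the Cayley-Klein Voronoi constructions exploit.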

Bregman Voronoi Diagrams

A framework for defining and building Voronoi diagrams for a broad class of distance functions called Bregman divergences, which allows one to define information-theoretic Voronoi diagrams in statistical parametric spaces based on the relative entropy of distributions.

Non-Euclidean geometries: the Cayley-Klein approach

A. Cayley and F. Klein discovered in the nineteenth century that Euclidean and non-Euclidean geometries can be considered as mathematical structures living inside projective-metric spaces.