# Algorithms for metric learning via contrastive embeddings

@inproceedings{Centurion2019AlgorithmsFM,
title={Algorithms for metric learning via contrastive embeddings},
author={Diego Ihara Centurion and Neshat Mohammadi and Anastasios Sidiropoulos},
booktitle={SoCG},
year={2019}
}
• Published in SoCG 13 July 2018
• Computer Science, Mathematics
We study the problem of supervised learning of a metric space under discriminative constraints. Given a universe $X$ and sets ${\cal S}, {\cal D}\subset {X \choose 2}$ of similar and dissimilar pairs, we seek to find a mapping $f:X\to Y$, into some target metric space $M=(Y,\rho)$, such that similar objects are mapped to points at distance at most $u$, and dissimilar objects are mapped to points at distance at least $\ell$. More generally, the goal is to find a mapping of maximum accuracy (that is…
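As a concrete reading of the objective above, the accuracy of a candidate mapping is simply the fraction of pairs whose distance constraint holds. A minimal sketch with hypothetical toy data (the function name and the sample mapping are illustrative, not from the paper):

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def accuracy(f, S, D, u, ell):
    """Fraction of constraints satisfied by the mapping f.

    Similar pairs in S should land within distance u;
    dissimilar pairs in D should land at distance at least ell.
    """
    sat = sum(dist(f[a], f[b]) <= u for a, b in S)
    sat += sum(dist(f[a], f[b]) >= ell for a, b in D)
    return sat / (len(S) + len(D))

# Toy universe X = {0, 1, 2, 3} mapped into the plane (hypothetical data).
f = {0: (0.0, 0.0), 1: (0.5, 0.0), 2: (3.0, 0.0), 3: (3.4, 0.2)}
S = [(0, 1), (2, 3)]   # similar pairs: want distance <= u
D = [(0, 2), (1, 3)]   # dissimilar pairs: want distance >= ell
print(accuracy(f, S, D, u=1.0, ell=2.0))  # all four constraints hold -> 1.0
```

The paper's optimization question is the hard direction: finding an `f` that maximizes this quantity, not merely scoring a given one.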

## 1 Citation

Learning Lines with Ordinal Constraints
• Computer Science, Mathematics
APPROX-RANDOM
• 2020
This work studies the problem of finding a mapping from a set of points into the real line, under ordinal triple constraints, and presents an approximation algorithm for the dense case of this problem.
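For intuition, an ordinal triple constraint of the kind this citing paper studies can be checked directly against a candidate line embedding. A small hypothetical sketch (names and data are illustrative):

```python
def satisfied_triples(x, triples):
    """Count ordinal triple constraints (a, b, c), read as
    'a is closer to b than to c', satisfied by a map x into the real line."""
    return sum(abs(x[a] - x[b]) < abs(x[a] - x[c]) for a, b, c in triples)

x = {0: 0.0, 1: 1.0, 2: 4.0}          # hypothetical embedding into the line
triples = [(0, 1, 2), (1, 0, 2), (2, 1, 0)]
print(satisfied_triples(x, triples))  # -> 3
```

The algorithmic content of the paper lies in choosing `x` to maximize this count, which the sketch deliberately does not attempt.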

## References

SHOWING 1-10 OF 34 REFERENCES
Metric clustering via consistent labeling
• Computer Science
SODA '08
• 2008
This work is the first to emphasize relative guarantees, that compare the produced solution to the optimal one for the input at hand, and provides a family of linear programming relaxations and simple randomized rounding procedures that achieve provably good approximation guarantees.
Distance Metric Learning for Large Margin Nearest Neighbor Classification
• Computer Science
NIPS
• 2005
This paper shows how to learn a Mahalanobis distance metric for kNN classification from labeled examples in a globally integrated manner, and finds that metrics trained in this way lead to significant improvements in kNN classification.
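The Mahalanobis distances learned by methods like this have a simple closed form, $d_M(x, y) = \sqrt{(x-y)^\top M (x-y)}$ for a positive semidefinite $M$. A hedged sketch (the "learned" matrix below is made up for illustration, not produced by the paper's algorithm):

```python
import numpy as np

def mahalanobis(x, y, M):
    """Distance d_M(x, y) = sqrt((x - y)^T M (x - y)) for a PSD matrix M.
    A learned M reshapes the space so same-class neighbors come closer."""
    d = np.asarray(x) - np.asarray(y)
    return float(np.sqrt(d @ M @ d))

x, y = [1.0, 2.0], [4.0, 6.0]
# With M = I this reduces to the ordinary Euclidean distance (sanity check).
print(mahalanobis(x, y, np.eye(2)))   # -> 5.0
# A hypothetical learned metric that stretches the first coordinate.
M = np.diag([4.0, 1.0])
print(mahalanobis(x, y, M))           # sqrt(4*9 + 16) = sqrt(52)
```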
Ordinal Embedding: Approximation Algorithms and Dimensionality Reduction
• Computer Science
APPROX-RANDOM
• 2008
Two polynomial-time constant-factor approximation algorithms for minimizing the relaxation in an embedding of an unweighted graph into a line metric and into a tree metric are developed.
Information-theoretic metric learning
• Computer Science
ICML '07
• 2007
An information-theoretic approach to learning a Mahalanobis distance function that can handle a wide variety of constraints and can optionally incorporate a prior on the distance function; regret bounds are derived for the resulting algorithm.
Reality Distortion: Exact and Approximate Algorithms for Embedding into the Line
• Computer Science
2015 IEEE 56th Annual Symposium on Foundations of Computer Science
• 2015
Algorithms for the problem of minimum distortion embeddings of finite metric spaces into the real line (or a finite subset of the line) are described, which yields a quasipolynomial running time for constant δ, and polynomial D.
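The distortion objective that such embedding algorithms minimize can be evaluated directly for any injective map into the line: it is the product of the worst expansion and the worst contraction over all pairs. A minimal sketch with a hypothetical three-point metric (all names are illustrative):

```python
from itertools import combinations

def distortion(d, f):
    """Distortion of an injective map f from a finite metric space
    (distance function d over the keys of f) into the real line:
    (max expansion) * (max contraction) over all pairs."""
    pairs = list(combinations(f, 2))
    expansion = max(abs(f[a] - f[b]) / d(a, b) for a, b in pairs)
    contraction = max(d(a, b) / abs(f[a] - f[b]) for a, b in pairs)
    return expansion * contraction

# A 3-point path metric, embedded isometrically into the line.
dmat = {frozenset((0, 1)): 1.0, frozenset((1, 2)): 1.0, frozenset((0, 2)): 2.0}
d = lambda a, b: dmat[frozenset((a, b))]
f = {0: 0.0, 1: 1.0, 2: 2.0}
print(distortion(d, f))  # isometric embedding -> 1.0
```

The hardness of the problem comes from searching over all such maps `f`; evaluating the objective, as above, is the easy part.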
Metric Learning: A Survey
• B. Kulis
• Computer Science
Found. Trends Mach. Learn.
• 2013
This survey presents an overview of existing research in metric learning, including recent progress on scaling to high-dimensional feature spaces and to data sets with an extremely large number of data points.
Inapproximability for metric embeddings into Rd
• Computer Science, Mathematics
• 2010
This work considers the problem of computing the smallest possible distortion for embedding of a given $n$-point metric space into $\mathbb{R}^d$, where $d$ is fixed (and small), and derives inapproximability with a factor roughly $n^{1/(22d-10)}$ for every fixed $d \geq 2$ by a conceptually very simple reduction.
Probabilistic approximation of metric spaces and its algorithmic applications
• Y. Bartal
• Computer Science, Mathematics
Proceedings of 37th Conference on Foundations of Computer Science
• 1996
It is proved that any metric space can be probabilistically-approximated by hierarchically well-separated trees (HST) with a polylogarithmic distortion.