Information-theoretic metric learning

@inproceedings{Davis2007InformationtheoreticML,
  title={Information-theoretic metric learning},
  author={Jason V. Davis and Brian Kulis and Prateek Jain and Suvrit Sra and Inderjit S. Dhillon},
  booktitle={ICML},
  year={2007}
}
In this paper, we present an information-theoretic approach to learning a Mahalanobis distance function. We formulate the problem as that of minimizing the differential relative entropy between two multivariate Gaussians under constraints on the distance function. We express this problem as a particular Bregman optimization problem---that of minimizing the LogDet divergence subject to linear constraints. Our resulting algorithm has several advantages over existing methods. First, our method can…
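Concretely, the LogDet formulation in the abstract minimizes D_ld(A, A0) = tr(A A0^{-1}) − log det(A A0^{-1}) − d over Mahalanobis matrices A, subject to d_A(x_i, x_j) ≤ u for similar pairs and d_A(x_i, x_j) ≥ l for dissimilar pairs. A minimal sketch of the cyclic Bregman-projection updates this problem yields is below; parameter names and the toy constraint encoding are illustrative assumptions, not the authors' released code:

```python
import numpy as np

def itml(X, pairs, labels, u=0.5, l=15.0, gamma=1.0, max_iter=1000, tol=1e-3):
    """Sketch of information-theoretic metric learning (ITML).

    Learns a Mahalanobis matrix A by cyclic rank-one Bregman projections,
    starting from the identity (i.e. regularizing toward Euclidean distance).
    `pairs` holds index pairs (i, j); labels[k] is +1 for similar pairs
    (target squared distance <= u) and -1 for dissimilar pairs (>= l).
    gamma trades off constraint satisfaction against staying near A0 = I.
    """
    n, d = X.shape
    A = np.eye(d)
    # per-constraint slack targets: u for similar pairs, l for dissimilar
    xi = np.where(np.asarray(labels) == 1, u, l).astype(float)
    lam = np.zeros(len(pairs))  # dual variables, kept nonnegative
    for _ in range(max_iter):
        lam_old = lam.copy()
        for k, (i, j) in enumerate(pairs):
            delta = labels[k]
            v = X[i] - X[j]
            p = v @ A @ v  # current squared Mahalanobis distance
            if p < 1e-12:
                continue
            # projection step size, clipped by the dual variable
            alpha = min(lam[k], delta / 2.0 * (1.0 / p - gamma / xi[k]))
            beta = delta * alpha / (1.0 - delta * alpha * p)
            xi[k] = gamma * xi[k] / (gamma + delta * alpha * xi[k])
            lam[k] -= alpha
            Av = A @ v
            A += beta * np.outer(Av, Av)  # rank-one update of A
        if np.abs(lam - lam_old).sum() < tol:
            break
    return A
```

Each inner-loop pass projects A onto one constraint in the LogDet geometry; because the update is rank-one, A stays positive semidefinite, so the learned d_A remains a valid (pseudo)metric.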
This paper has highly influenced 324 other papers and has an estimated 1,958 citations, of which 1,093 have been extracted by Semantic Scholar; it lists 6 references.