Ranked List Loss for Deep Metric Learning

@article{Wang2019RankedLL,
  title={Ranked List Loss for Deep Metric Learning},
  author={Xinshao Wang and Yang Hua and Elyor Kodirov and Guosheng Hu and Romain Garnier and Neil Martin Robertson},
  journal={2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2019},
  pages={5202-5211}
}
The objective of deep metric learning (DML) is to learn embeddings that can capture semantic similarity information among data points. Existing pairwise or tripletwise loss functions used in DML are known to suffer from slow convergence due to a large proportion of trivial pairs or triplets as the model improves. To improve this, ranking-motivated structured losses have recently been proposed to incorporate multiple examples and exploit the structured information among them. They converge faster and…
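
To make the idea above concrete, here is a minimal PyTorch sketch of a ranked-list-style loss: positives are pulled inside an inner boundary, and violating negatives are pushed past an outer boundary, weighted by how strongly they violate it. The function name, the hyperparameter defaults (alpha, margin, temperature), and the exact weighting scheme are illustrative assumptions, not the paper's reference formulation.

import torch
import torch.nn.functional as F

def ranked_list_loss(embeddings, labels, alpha=1.2, margin=0.4, temperature=10.0):
    # Illustrative sketch only; alpha, margin and temperature are assumed
    # defaults, not values taken from the paper.
    emb = F.normalize(embeddings, dim=1)                 # embed on the unit hypersphere
    dist = torch.cdist(emb, emb, p=2)                    # pairwise Euclidean distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    pos_mask = (same & ~eye).float()
    neg_mask = (~same).float()

    # Pull positives that lie outside the inner boundary (alpha - margin).
    pos_viol = F.relu(dist - (alpha - margin)) * pos_mask
    pos_loss = pos_viol.sum(dim=1) / pos_mask.sum(dim=1).clamp(min=1.0)

    # Push negatives that lie inside the boundary alpha, weighting each
    # violating negative by how strongly it violates the boundary.
    neg_viol = F.relu(alpha - dist) * neg_mask
    violating = (neg_viol > 0).float()
    weights = torch.exp(temperature * neg_viol) * violating
    neg_loss = (weights * neg_viol).sum(dim=1) / weights.sum(dim=1).clamp(min=1e-12)

    return (pos_loss + neg_loss).mean()

# Example usage on random data:
emb = torch.randn(32, 128, requires_grad=True)
labels = torch.randint(0, 8, (32,))
ranked_list_loss(emb, labels).backward()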
Citations

Deep Metric Learning with Self-Supervised Ranking
TLDR
This paper presents a novel self-supervised ranking auxiliary framework, which captures intra-class characteristics as well as inter-class characteristics for better metric learning.
The General Pair-based Weighting Loss for Deep Metric Learning
TLDR
Extensive experiments on three image retrieval datasets show that the general pair-based weighting loss obtains new state-of-the-art performance, demonstrating the effectiveness of pair-based sample mining and pair weighting for deep metric learning.
Proxy Anchor Loss for Deep Metric Learning
TLDR
This paper presents a new proxy-based loss that takes advantage of both pair-based and proxy-based methods and overcomes their limitations, and allows embedding vectors of data to interact with each other through its gradients to exploit data-to-data relations.
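
As a rough illustration of the proxy-anchor idea (one learnable proxy per class acting as an anchor for the whole batch), here is a hedged PyTorch sketch; the class name, the margin/alpha defaults, and the random initialization are assumptions, not the paper's reference implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ProxyAnchorSketch(nn.Module):
    # Illustrative module; hyperparameter defaults are assumptions.
    def __init__(self, num_classes, dim, margin=0.1, alpha=32.0):
        super().__init__()
        self.proxies = nn.Parameter(torch.randn(num_classes, dim))
        self.margin, self.alpha = margin, alpha

    def forward(self, embeddings, labels):
        emb = F.normalize(embeddings, dim=1)
        prox = F.normalize(self.proxies, dim=1)
        sim = emb @ prox.t()                              # (batch, num_classes) cosine similarity
        pos_mask = F.one_hot(labels, prox.size(0)).bool()

        # Each proxy pulls its positives together and pushes everything else
        # away; log-sum-exp lets harder examples dominate the gradient, which
        # is how embeddings interact with each other through the gradients.
        pos_exp = torch.where(pos_mask, torch.exp(-self.alpha * (sim - self.margin)),
                              torch.zeros_like(sim))
        neg_exp = torch.where(~pos_mask, torch.exp(self.alpha * (sim + self.margin)),
                              torch.zeros_like(sim))
        with_pos = pos_mask.any(dim=0)                    # proxies present in the batch
        pos_term = torch.log1p(pos_exp.sum(dim=0))[with_pos].mean()
        neg_term = torch.log1p(neg_exp.sum(dim=0)).mean()
        return pos_term + neg_term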
Distribution Structure Learning Loss (DSLL) Based on Deep Metric Learning for Image Retrieval
TLDR
Proposes a metric distance learning scheme for highly matching images that preserves the similarity structure among them, together with an entropy-weighted structural distribution that sets the weights of representative negative samples so that the negatives preserve the consistency of their structural distribution.
Instance Cross Entropy for Deep Metric Learning
TLDR
This work proposes instance cross entropy (ICE), which measures the difference between an estimated instance-level matching distribution and its ground-truth one, and rescales samples' gradients to control the differentiation degree over training examples instead of truncating them by sample mining.
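
One hedged reading of this instance-level matching idea, sketched below: treat each anchor's similarities to the other in-batch instances as a softmax distribution and apply cross-entropy against the label-derived target, with a temperature rescaling gradients instead of a mining step discarding samples. The function name and the scale default are assumptions rather than the paper's exact formulation.

import torch
import torch.nn.functional as F

def instance_cross_entropy(embeddings, labels, scale=16.0):
    # Illustrative sketch; 'scale' is an assumed temperature, not a paper value.
    emb = F.normalize(embeddings, dim=1)
    sim = scale * (emb @ emb.t())
    eye = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
    sim = sim.masked_fill(eye, float('-inf'))            # an instance cannot match itself
    log_p = F.log_softmax(sim, dim=1)

    # Ground-truth matching distribution: uniform over same-class instances.
    target = ((labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye).float()
    target = target / target.sum(dim=1, keepdim=True).clamp(min=1.0)
    return -(target * log_p).sum(dim=1).mean()           # anchors without positives contribute zero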
Deep Metric Learning with Graph Consistency
TLDR
This paper empirically demonstrates the effectiveness of the graph regularization idea, achieving competitive results on the popular CUB, CARS, Stanford Online Products, and In-Shop datasets.
Unbiased Evaluation of Deep Metric Learning Algorithms
TLDR
Performs an unbiased comparison of the most popular DML baseline methods under the same conditions and, more importantly, without obfuscating any hyperparameter tuning or adjustment needed to favor a particular method.
Improving Deep Metric Learning by Divide and Conquer
TLDR
This work significantly improves upon the state-of-the-art in image retrieval and clustering on CUB200-2011, CARS196, SOP, In-shop Clothes, and VehicleID datasets by jointly splitting the embedding space and the data hierarchically into smaller sub-parts.
Multi-Head Deep Metric Learning Using Global and Local Representations
TLDR
The proposed DML approach makes use of a hybrid loss that integrates pairwise-based and proxy-based loss functions to leverage rich data-to-data relations as well as fast convergence, and uses second-order attention for feature enhancement to enable accurate and efficient retrieval.
Deep Compositional Metric Learning
In this paper, we propose a deep compositional metric learning (DCML) framework for effective and generalizable similarity measurement between images. Conventional deep metric learning methods…

References

SHOWING 1-10 OF 84 REFERENCES
Ranked List Loss for Deep Metric Learning.
TLDR
This work unveils two limitations of existing ranking-motivated structured losses, proposes a novel ranked list loss to solve both of them, and proposes to learn a hypersphere for each class to preserve useful similarity structure inside it, which functions as regularisation.
Multi-Similarity Loss With General Pair Weighting for Deep Metric Learning
TLDR
A General Pair Weighting framework is established, which casts the sampling problem of deep metric learning into a unified view of pair weighting through gradient analysis, providing a powerful tool for understanding recent pair-based loss functions.
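
To make the pair-weighting view concrete, here is a hedged sketch of a multi-similarity-style loss: positives and negatives are aggregated with log-sum-exp terms whose alpha/beta temperatures implicitly weight each pair by its hardness. The defaults and the omission of the paper's online pair-mining step are simplifications.

import torch
import torch.nn.functional as F

def multi_similarity_loss(embeddings, labels, alpha=2.0, beta=50.0, lam=1.0):
    # Illustrative sketch; mining step omitted, hyperparameter defaults assumed.
    emb = F.normalize(embeddings, dim=1)
    sim = emb @ emb.t()                                   # cosine similarity matrix
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)

    # Masked-out entries become +/- infinity so they vanish under exp().
    pos = torch.where(same & ~eye, sim, torch.full_like(sim, float('inf')))
    neg = torch.where(~same, sim, torch.full_like(sim, float('-inf')))

    # Log-sum-exp softly up-weights hard positives (low similarity)
    # and hard negatives (high similarity).
    pos_term = torch.log1p(torch.exp(-alpha * (pos - lam)).sum(dim=1)) / alpha
    neg_term = torch.log1p(torch.exp(beta * (neg - lam)).sum(dim=1)) / beta
    return (pos_term + neg_term).mean()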
SoftTriple Loss: Deep Metric Learning Without Triplet Sampling
TLDR
The SoftTriple loss is proposed to extend the SoftMax loss, which is equivalent to a smoothed triplet loss where each class has a single center, with multiple centers for each class.
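
A hedged sketch of the multiple-centers idea: each class owns K learnable centers, a softmax over a class's centers yields a relaxed per-class similarity, and the result feeds a margin SoftMax. Names and defaults (gamma, scale, delta) are assumptions, not the reference code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftTripleSketch(nn.Module):
    # Illustrative module; hyperparameter defaults are assumptions.
    def __init__(self, num_classes, k_centers, dim, gamma=0.1, scale=20.0, delta=0.01):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, k_centers, dim))
        self.gamma, self.scale, self.delta = gamma, scale, delta

    def forward(self, embeddings, labels):
        emb = F.normalize(embeddings, dim=1)              # (B, D)
        ctr = F.normalize(self.centers, dim=2)            # (C, K, D)
        sim = torch.einsum('bd,ckd->bck', emb, ctr)       # similarity to every center
        # Soft assignment over each class's K centers gives a relaxed
        # per-class similarity (a smoothed max over the centers).
        prob = F.softmax(sim / self.gamma, dim=2)
        class_sim = (prob * sim).sum(dim=2)               # (B, C)
        margin = self.delta * F.one_hot(labels, class_sim.size(1))
        return F.cross_entropy(self.scale * (class_sim - margin), labels)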
Improved Deep Metric Learning with Multi-class N-pair Loss Objective
TLDR
This paper proposes a new metric learning objective called multi-class N-pair loss, which generalizes triplet loss by allowing joint comparison among more than one negative example, and reduces the computational burden of evaluating deep embedding vectors via an efficient batch construction strategy using only N pairs of examples.
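
The (N+1)-tuple construction reduces to very little code when each class contributes exactly one (anchor, positive) pair per batch, since the positives of all other classes double as negatives. A hedged sketch under that batching assumption (the original also adds an L2 regularizer on the embeddings, omitted here):

import torch
import torch.nn.functional as F

def n_pair_loss(anchors, positives):
    # Assumes one (anchor, positive) pair per class in the batch, so row i of
    # 'positives' is the positive for anchor i and a negative for every other anchor.
    logits = anchors @ positives.t()                      # (N, N); diagonal holds positive pairs
    targets = torch.arange(anchors.size(0), device=anchors.device)
    # Softmax cross-entropy over the N candidates equals
    # log(1 + sum_{j != i} exp(f_i . f_j+  -  f_i . f_i+)).
    return F.cross_entropy(logits, targets)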
Deep Metric Learning via Lifted Structured Feature Embedding
TLDR
An algorithm for taking full advantage of the training batches in neural network training by lifting the vector of pairwise distances within the batch to the matrix of pairwise distances, enabling the algorithm to learn state-of-the-art feature embeddings by optimizing a novel structured prediction objective on the lifted problem.
Deep Metric Learning by Online Soft Mining and Class-Aware Attention
TLDR
This work proposes a novel sample mining method, called Online Soft Mining (OSM), which assigns one continuous score to each sample to make use of all samples in the mini-batch, and introduces Class-Aware Attention (CAA) that assigns little attention to abnormal data samples.
Instance Cross Entropy for Deep Metric Learning
TLDR
This work proposes instance cross entropy (ICE), which measures the difference between an estimated instance-level matching distribution and its ground-truth one, and rescales samples' gradients to control the differentiation degree over training examples instead of truncating them by sample mining.
Deep Metric Learning with Hierarchical Triplet Loss
TLDR
A novel hierarchical triplet loss capable of automatically collecting informative training samples (triplets) via a defined hierarchical tree that encodes global context information, encouraging the model to learn more discriminative features from visually similar classes and leading to faster convergence and better performance.
No Fuss Distance Metric Learning Using Proxies
TLDR
This paper proposes to optimize the triplet loss on a different space of triplets, consisting of an anchor data point and similar and dissimilar proxy points which are learned as well, and proposes a proxy-based loss which improves on state-of-the-art results for three standard zero-shot learning datasets.
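
A hedged sketch of the proxy idea: one learnable proxy per class, with an NCA-style softmax over (negative squared) distances to the proxies. The normalization, the scale factor, and the use of a full softmax (the paper excludes the true class from the denominator) are simplifying assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ProxyNCASketch(nn.Module):
    # Illustrative module; 'scale' is an assumed temperature.
    def __init__(self, num_classes, dim, scale=8.0):
        super().__init__()
        self.proxies = nn.Parameter(torch.randn(num_classes, dim))
        self.scale = scale

    def forward(self, embeddings, labels):
        emb = F.normalize(embeddings, dim=1)
        prox = F.normalize(self.proxies, dim=1)
        dist = torch.cdist(emb, prox) ** 2                # squared distance to every proxy
        # NCA: the anchor should be closer to its own class proxy than to the rest.
        return F.cross_entropy(-self.scale * dist, labels)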
Unbiased Evaluation of Deep Metric Learning Algorithms
TLDR
Performs an unbiased comparison of the most popular DML baseline methods under the same conditions and, more importantly, without obfuscating any hyperparameter tuning or adjustment needed to favor a particular method.