A Non-isotropic Probabilistic Take on Proxy-based Deep Metric Learning

Michael Kirchhof, Karsten Roth, Zeynep Akata, Enkelejda Kasneci
Proxy-based Deep Metric Learning (DML) learns deep representations by embedding images close to their class representatives (proxies), commonly with respect to the angle between them. However, this disregards the embedding norm, which can carry additional beneficial context such as class- or image-intrinsic uncertainty. In addition, proxy-based DML struggles to learn class-internal structures. To address both issues at once, we introduce non-isotropic probabilistic proxy-based DML. We model…

Non-isotropy Regularization for Proxy-based Deep Metric Learning

Non-isotropy regularization (NIR) is proposed for proxy-based Deep Metric Learning; it retains or even improves the superior convergence properties of proxy-based methods, making NIR attractive for practical use.

Making Classification Competitive for Deep Metric Learning

It is demonstrated that a standard classification network can be transformed into a variant of proxy-based metric learning that is competitive with non-parametric approaches across a wide variety of image retrieval tasks, and that it can learn high-dimensional binary embeddings achieving new state-of-the-art performance.
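The core trick behind treating a classifier as proxy-based metric learning can be sketched as follows: L2-normalize both the embeddings and the classifier's weight rows, so each weight row acts as a class proxy and the logits become scaled cosine similarities (a "normalized softmax"). This is a minimal illustrative sketch, not the paper's exact implementation; the temperature value and function names are assumptions.

```python
import numpy as np

def normalized_softmax_logits(embeddings, class_weights, temperature=0.05):
    """Logits as scaled cosine similarities between embeddings and proxies.

    Normalizing the classifier's weight rows turns each row into a class
    proxy; training with cross-entropy then pulls embeddings toward their
    proxy on the unit sphere, i.e. proxy-based metric learning.
    """
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    w = class_weights / np.linalg.norm(class_weights, axis=1, keepdims=True)
    return (e @ w.T) / temperature  # shape (N, C)

def cross_entropy(logits, labels):
    # standard softmax cross-entropy over the cosine logits
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```

Embeddings that coincide with their class proxy yield a much lower loss than embeddings sitting on the wrong proxy, which is what makes the learned cosine space usable for retrieval.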

Unsupervised Deep Metric Learning via Orthogonality Based Probabilistic Loss

This paper proposes an unsupervised learning approach that learns a metric without making use of class labels, and proposes a probabilistic loss function that minimizes the chances of each triplet violating an angular constraint.

Fewer is More: A Deep Graph Metric Learning Perspective Using Fewer Proxies

A novel Proxy-based deep Graph Metric Learning (ProxyGML) approach is proposed from the perspective of graph classification; it uses fewer proxies yet achieves better overall performance, and introduces a reverse label propagation algorithm through which a discriminative metric space is learned during subgraph classification.

Proxy Anchor Loss for Deep Metric Learning

This paper presents a new proxy-based loss that takes advantage of both pair- and proxy-based methods while overcoming their limitations; it lets embedding vectors of data interact with each other in its gradients to exploit data-to-data relations.
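The Proxy Anchor loss can be sketched in a few lines: each proxy acts as an anchor, pulling its positive embeddings in via a soft log-sum-exp over similarities and pushing all negatives away, so every embedding's gradient depends on the other embeddings in the batch. A minimal numpy sketch following the published formulation (default alpha and delta match the paper; the batch setup is illustrative):

```python
import numpy as np

def proxy_anchor_loss(embeddings, proxies, labels, alpha=32.0, delta=0.1):
    """Proxy Anchor loss (Kim et al., CVPR 2020), numpy sketch.

    embeddings: (N, D) L2-normalized image embeddings
    proxies:    (C, D) L2-normalized class proxies
    labels:     (N,)   integer class labels
    """
    sim = embeddings @ proxies.T                       # (N, C) cosine similarities
    pos_mask = np.eye(proxies.shape[0], dtype=bool)[labels]
    neg_mask = ~pos_mask

    # positive term: for each proxy with positives in the batch,
    # softly penalize the least similar positive
    pos_exp = np.where(pos_mask, np.exp(-alpha * (sim - delta)), 0.0)
    with_pos = pos_mask.any(axis=0)
    pos_term = np.log(1.0 + pos_exp.sum(axis=0)[with_pos]).sum() / with_pos.sum()

    # negative term: for every proxy, softly penalize the most similar negative
    neg_exp = np.where(neg_mask, np.exp(alpha * (sim + delta)), 0.0)
    neg_term = np.log(1.0 + neg_exp.sum(axis=0)).sum() / proxies.shape[0]

    return pos_term + neg_term
```

The log-sum-exp structure is what gives the "data-to-data" gradients: each embedding's pull toward (or push from) a proxy is weighted relative to the other embeddings competing for that proxy.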

Deep Variational Metric Learning

This paper proposes a deep variational metric learning (DVML) framework to explicitly model the intra-class variance and disentangle the intra-class invariance, namely the class centers; it can simultaneously generate discriminative samples to improve robustness.

MIC: Mining Interclass Characteristics for Improved Metric Learning

This work proposes a novel surrogate task to learn visual characteristics shared across classes with a separate encoder, trained jointly with the encoder for class information by reducing their mutual information.

Deep Compositional Metric Learning

A deep compositional metric learning (DCML) framework is proposed for effective and generalizable similarity measurement between images. A set of learnable compositors combines the sub-embeddings and is trained with a self-reinforced loss; the compositors serve as relays that distribute the diverse training signals so that the discrimination ability is not destroyed.

Metric Learning With HORDE: High-Order Regularizer for Deep Embeddings

This paper tackles the scattering problem with a distribution-aware regularization named HORDE, which enforces visually close images to have deep features drawn from the same distribution, well localized in the feature space.

Deep Relational Metric Learning

A deep relational metric learning (DRML) framework for image clustering and retrieval that adaptively learns an ensemble of features characterizing an image from different aspects, modeling both interclass and intraclass distributions.