Corpus ID: 221761350

S2SD: Simultaneous Similarity-based Self-Distillation for Deep Metric Learning

@article{Roth2020S2SDSS,
  title={S2SD: Simultaneous Similarity-based Self-Distillation for Deep Metric Learning},
  author={Karsten Roth and Timo Milbich and Bjorn Ommer and Joseph Paul Cohen and Marzyeh Ghassemi},
  journal={ArXiv},
  year={2020},
  volume={abs/2009.08348}
}
  • Karsten Roth, Timo Milbich, Bjorn Ommer, Joseph Paul Cohen, Marzyeh Ghassemi
  • Published 2020
  • Computer Science
  • ArXiv
  • Deep Metric Learning (DML) provides a crucial tool for visual similarity and zero-shot retrieval applications by learning generalizing embedding spaces, although recent work in DML has shown strong performance saturation across training objectives. However, generalization capacity is known to scale with the embedding space dimensionality. Unfortunately, high-dimensional embeddings also create higher retrieval costs for downstream applications. To remedy this, we propose S2SD - Simultaneous Similarity-based Self-Distillation.
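
Since the abstract only names the approach, the following is a minimal, illustrative PyTorch sketch of what similarity-based self-distillation between a high-dimensional auxiliary head and a compact deployed embedding could look like. The class and parameter names (SimilaritySelfDistiller, feat_dim, embed_dim, aux_dim, temperature) are hypothetical, and the snippet is a simplified reading of the title and abstract, not the paper's actual formulation or code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimilaritySelfDistiller(nn.Module):
    """Toy illustration of similarity-based self-distillation.

    A compact embedding head (kept at test time, so retrieval stays cheap)
    is trained to match the pairwise similarity structure produced by a
    higher-dimensional auxiliary head on the same backbone features.
    """

    def __init__(self, feat_dim=512, embed_dim=128, aux_dim=1024, temperature=1.0):
        super().__init__()
        self.student = nn.Linear(feat_dim, embed_dim)  # deployed low-dim embedding
        self.teacher = nn.Linear(feat_dim, aux_dim)    # auxiliary high-dim head (training only)
        self.temperature = temperature

    @staticmethod
    def _similarities(embeddings):
        # Cosine similarity matrix over the batch.
        z = F.normalize(embeddings, dim=1)
        return z @ z.t()

    def forward(self, features):
        s_low = self._similarities(self.student(features))
        s_high = self._similarities(self.teacher(features))
        # Match row-wise similarity distributions with a KL divergence,
        # letting the high-dimensional space act as teacher for the low-dim one.
        log_p = F.log_softmax(s_low / self.temperature, dim=1)
        q = F.softmax(s_high.detach() / self.temperature, dim=1)
        return F.kl_div(log_p, q, reduction="batchmean")


if __name__ == "__main__":
    torch.manual_seed(0)
    features = torch.randn(32, 512)        # stand-in for backbone outputs
    distiller = SimilaritySelfDistiller()
    base_loss = torch.tensor(0.0)          # placeholder for any standard DML objective
    total_loss = base_loss + 1.0 * distiller(features)
    total_loss.backward()
    print(f"distillation loss: {total_loss.item():.4f}")
```

In this sketch the high-dimensional similarities are detached so that gradients flow only into the compact head, and the distillation term is simply added to whatever base DML loss is in use; only the low-dimensional head would be kept for retrieval, which is how the dimensionality/retrieval-cost trade-off described in the abstract is sidestepped.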
