Preserving Semantic Relations for Zero-Shot Learning

@inproceedings{Annadani2018PreservingSR,
  title={Preserving Semantic Relations for Zero-Shot Learning},
  author={Yashas Annadani and Soma Biswas},
  booktitle={2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2018},
  pages={7603-7612}
}
  • Abstract (excerpt)
    Zero-shot learning has gained popularity due to its potential to scale recognition models without requiring additional training data. [...] We devise objective functions to preserve these relations in the embedding space, thereby inducing semanticity to the embedding space. Through extensive experimental evaluation on five benchmark datasets, we demonstrate that inducing semanticity to the embedding space is beneficial for zero-shot learning. The proposed approach outperforms the state-of-the-art on…
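The "objective functions to preserve these relations" can be sketched in code. The sketch below is illustrative only, not the authors' formulation: it assumes the semantic relations are derived from cosine similarity between class attribute vectors (with a hypothetical threshold `tau`), and it pairs a pull term for semantically similar classes with a margin-based push term for dissimilar ones.

```python
import numpy as np

def semantic_relation_loss(embeddings, attributes, margin=1.0, tau=0.5):
    """Illustrative relation-preserving objective (not the paper's exact loss).

    Class pairs whose attribute vectors are similar (cosine >= tau) are pulled
    together in the embedding space; dissimilar pairs are pushed at least
    `margin` apart. `tau` and `margin` are hypothetical hyperparameters.
    """
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    n = len(embeddings)
    loss = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(embeddings[i] - embeddings[j])
            if cos(attributes[i], attributes[j]) >= tau:
                loss += d ** 2                       # pull similar classes together
            else:
                loss += max(0.0, margin - d) ** 2    # push dissimilar classes apart
    return loss
```

Under this sketch, an embedding that places semantically similar classes near each other incurs a lower loss than one that scatters them, which is the sense in which the objective "induces semanticity" into the embedding space.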



    Citations

    Publications citing this paper (showing 1-10 of 64):

    • Domain-Specific Embedding Network for Zero-Shot Recognition
    • Heterogeneous Graph-based Knowledge Transfer for Generalized Zero-shot Learning
    • Semantically Aligned Bias Reducing Zero Shot Learning
    • Towards Effective Deep Embedding for Zero-Shot Learning
    • Convolutional Prototype Learning for Zero-Shot Recognition
    • Enhancing Visual Embeddings through Weakly Supervised Captioning for Zero-Shot Learning
    • Hierarchical Prototype Learning for Zero-Shot Recognition
    • Landmark Selection for Zero-shot Learning
    • Marginalized Latent Semantic Encoder for Zero-Shot Learning (Zhengming Ding and Hongfu Liu, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition)
    • Visual Space Optimization for Zero-shot Learning


    CITATION STATISTICS

    • 15 highly influenced citations
    • 26 citations per year on average from 2018 through 2019
    • 264% increase in citations per year in 2019 over 2018
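The two rate statistics above are mutually consistent; as a sanity check, the per-year counts of 11 (2018) and 40 (2019) used below are hypothetical values chosen to reproduce the reported figures, since the actual per-year breakdown is not given on this page.

```python
# Hypothetical per-year citation counts consistent with the reported stats.
c2018, c2019 = 11, 40

# Average citations per year over 2018-2019.
avg_per_year = round((c2018 + c2019) / 2)            # -> 26

# Percentage increase from 2018 to 2019.
pct_increase = round((c2019 - c2018) / c2018 * 100)  # -> 264
```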

    References

    Publications referenced by this paper (showing 1-10 of 46):

    • Zero-Shot Learning—A Comprehensive Evaluation of the Good, the Bad and the Ugly
    • ImageNet Large Scale Visual Recognition Challenge
    • DeViSE: A Deep Visual-Semantic Embedding Model (A. Frome, G. S. Corrado, +3 authors, T. Mikolov; NIPS 2013)
    • Semantic Autoencoder for Zero-Shot Learning
    • Synthesized Classifiers for Zero-Shot Learning
    • SUN database: Large-scale scene recognition from abbey to zoo
    • Zero-Shot Learning — The Good, the Bad and the Ugly
    • Learning a Deep Embedding Model for Zero-Shot Learning
    • Zero-Shot Learning by Convex Combination of Semantic Embeddings