Feature Generating Networks for Zero-Shot Learning

Yongqin Xian, Tobias Lorenz, Bernt Schiele, Zeynep Akata
Suffering from the extreme training data imbalance between seen and unseen classes, most existing state-of-the-art approaches fail to achieve satisfactory results on the challenging generalized zero-shot learning task. To circumvent the need for labeled examples of unseen classes, we propose a novel generative adversarial network (GAN) that synthesizes CNN features conditioned on class-level semantic information, offering a shortcut directly from a semantic descriptor of a class to a class…
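The idea in the abstract — synthesize CNN features for unseen classes from their semantic descriptors, then train an ordinary classifier on the synthetic data — can be sketched as follows. This is a minimal illustration only: the generator here uses untrained random weights, the 85-dimensional attribute and 2048-dimensional feature sizes merely echo common zero-shot benchmarks (AWA attributes, ResNet features), and the paper's actual generator is trained with an adversarial plus classification loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: class attributes (e.g. 85-d), noise, CNN features (e.g. 2048-d).
ATTR_DIM, NOISE_DIM, FEAT_DIM = 85, 32, 2048
HIDDEN = 512

# Hypothetical generator G(z, c): a one-hidden-layer MLP mapping noise plus a
# class-attribute vector to a synthetic CNN feature. Weights are random here;
# in the paper they would be learned adversarially against real seen-class features.
W1 = rng.normal(0.0, 0.02, (NOISE_DIM + ATTR_DIM, HIDDEN))
W2 = rng.normal(0.0, 0.02, (HIDDEN, FEAT_DIM))

def generate(attr, n):
    """Synthesize n feature vectors for the class described by attribute vector attr."""
    z = rng.normal(size=(n, NOISE_DIM))                      # per-sample noise
    x = np.hstack([z, np.tile(attr, (n, 1))])                # condition on the class
    h = np.maximum(x @ W1, 0.0)                              # ReLU hidden layer
    return np.maximum(h @ W2, 0.0)                           # non-negative, like ReLU CNN features

# Two hypothetical *unseen* classes, each known only by its attribute vector.
attrs = {0: rng.random(ATTR_DIM), 1: rng.random(ATTR_DIM)}

# Synthesize a labeled training set for them; any standard classifier
# (softmax, SVM, ...) could now be fit on (X, y) without real unseen-class images.
X = np.vstack([generate(a, 100) for a in attrs.values()])
y = np.repeat(list(attrs.keys()), 100)
print(X.shape, y.shape)  # (200, 2048) (200,)
```

The point of the shortcut is visible in the last step: once features can be generated per class, the generalized zero-shot problem reduces to supervised classification over seen and synthesized-unseen features together.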
Highly Cited: this paper has 30 citations.

Publications referenced by this paper.
Referenced works (55 in total) include:

Attribute-Based Classification for Zero-Shot Visual Object Categorization

IEEE Transactions on Pattern Analysis and Machine Intelligence • 2014

Generative Moment Matching Networks


Generative Adversarial Nets


StackGAN: Text to Photo-Realistic Image Synthesis with Stacked Generative Adversarial Networks

2017 IEEE International Conference on Computer Vision (ICCV) • 2017

Synthesized Classifiers for Zero-Shot Learning

2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) • 2016

DeViSE: A Deep Visual-Semantic Embedding Model


Generating Visual Representations for Zero-Shot Classification

2017 IEEE International Conference on Computer Vision Workshops (ICCVW) • 2017
