Zero-Shot Task Transfer

@article{Pal2019ZeroShotTT,
  title={Zero-Shot Task Transfer},
  author={Arghya Pal and V. Balasubramanian},
  journal={2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2019},
  pages={2184-2193}
}
  • Arghya Pal, V. Balasubramanian
  • Published 2019
  • Computer Science
  • 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
  • In this work, we present a novel meta-learning algorithm that regresses model parameters for novel tasks for which no ground truth is available (zero-shot tasks). [...] To the best of our knowledge, this is the first such effort on zero-shot learning in the task space. (A rough illustrative sketch of this idea follows below.)
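
As a rough illustration of the idea summarized above, and not the paper's actual architecture, the sketch below shows a minimal parameter-regression meta-learner in PyTorch: it maps the flattened parameters of models trained on known tasks, weighted by a hypothetical task-correlation vector, to a parameter vector for an unseen task. The class name TaskParamRegressor, the correlation-based weighting, and all shapes and hyperparameters are illustrative assumptions.

# Minimal sketch, NOT the paper's TTNet: regress parameters for a zero-shot task
# from the parameters of models trained on known tasks, guided by task correlations.
import torch
import torch.nn as nn

class TaskParamRegressor(nn.Module):
    """Maps known-task parameters to parameters for a task with no ground truth."""
    def __init__(self, param_dim: int, num_known_tasks: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(param_dim * num_known_tasks, hidden),
            nn.ReLU(),
            nn.Linear(hidden, param_dim),  # predicted parameter vector for the novel task
        )

    def forward(self, known_params: torch.Tensor, task_corr: torch.Tensor) -> torch.Tensor:
        # known_params: (num_known_tasks, param_dim) flattened parameters of trained models
        # task_corr:    (num_known_tasks,) assumed relatedness of each known task to the
        #               zero-shot task (e.g. from a task taxonomy), used as a weighting
        weighted = known_params * task_corr.unsqueeze(1)
        return self.net(weighted.flatten())

# Usage sketch: 4 known tasks, each represented by a 1000-dim flattened parameter vector.
regressor = TaskParamRegressor(param_dim=1000, num_known_tasks=4)
known = torch.randn(4, 1000)               # stand-in for trained task-model parameters
corr = torch.tensor([0.9, 0.4, 0.7, 0.1])  # stand-in task-correlation scores
novel_params = regressor(known, corr)      # predicted parameters; no ground truth used

In a real setting such a regressor would itself be meta-trained on the known tasks before its output is evaluated on the novel task; that training loop is omitted from this sketch.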

    Citations (citation counts in parentheses)

    Transforming task representations to allow deep learning models to perform novel tasks (2)
    A Meta-Learning Framework for Generalized Zero-Shot Learning (4)
    LSM: Learning Subspace Minimization for Low-Level Vision (1)
    Learning Across Tasks and Domains (3)
    Knowledge Transfer in Vision Recognition
    Bowtie Networks: Generative Modeling for Joint Few-Shot Recognition and Novel-View Synthesis
    Zero-shot task adaptation by homoiconic meta-mapping (1)

    References

    Publications referenced by this paper (47 in total; citation counts in parentheses):
    Using Task Features for Zero-Shot Knowledge Transfer in Lifelong Learning (59)
    Zero-Shot Task Generalization with Multi-Task Deep Reinforcement Learning (124)
    Taskonomy: Disentangling Task Transfer Learning (314)
    Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks (2092)
    Learning to Model the Tail (124)
    Learning Transferrable Representations for Unsupervised Domain Adaptation (134)
    Marr Revisited: 2D-3D Alignment via Surface Normal Prediction (161)
    Convex multi-task feature learning (1208)
    RoomNet: End-to-End Room Layout Estimation (69)