Corpus ID: 208513960

Auxiliary Learning for Deep Multi-task Learning

@article{Liu2019AuxiliaryLF,
  title={Auxiliary Learning for Deep Multi-task Learning},
  author={Yifan Liu and Bohan Zhuang and Chunhua Shen and Hao Chen and Wei Yin},
  journal={arXiv: Computer Vision and Pattern Recognition},
  year={2019}
}
Multi-task learning (MTL) is an efficient approach to solving multiple tasks simultaneously, achieving better speed and performance than handling each single task in turn. Most current methods can be categorized as either: (i) hard parameter sharing, where a subset of the parameters is shared among tasks while the remaining parameters are task-specific; or (ii) soft parameter sharing, where all parameters are task-specific but are jointly regularized. Both methods suffer from limitations: the…
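To make the distinction in the abstract concrete, below is a minimal PyTorch-style sketch of hard parameter sharing: a shared trunk feeds two task-specific heads, and the task losses are summed. The module names, layer sizes, example tasks, and unweighted loss sum are illustrative assumptions, not the architecture proposed in the paper.

```python
import torch
import torch.nn as nn

class HardSharingMTL(nn.Module):
    """Hard parameter sharing: one shared trunk, one head per task.

    The trunk parameters receive gradients from every task loss, while each
    head is updated only by its own task. The two example tasks (semantic
    segmentation + depth regression) and all dimensions are illustrative.
    """

    def __init__(self, in_dim=128, hidden=256, num_classes=19):
        super().__init__()
        # Shared parameters (used by all tasks).
        self.trunk = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
        )
        # Task-specific parameters.
        self.seg_head = nn.Linear(hidden, num_classes)  # classification head
        self.depth_head = nn.Linear(hidden, 1)          # regression head

    def forward(self, x):
        feats = self.trunk(x)
        return self.seg_head(feats), self.depth_head(feats)


model = HardSharingMTL()
x = torch.randn(8, 128)
seg_target = torch.randint(0, 19, (8,))
depth_target = torch.randn(8, 1)

seg_logits, depth_pred = model(x)
# A simple unweighted sum of task losses; how to weight or balance the tasks
# is exactly the question that many MTL methods try to answer.
loss = nn.CrossEntropyLoss()(seg_logits, seg_target) + \
       nn.functional.l1_loss(depth_pred, depth_target)
loss.backward()
```

Soft parameter sharing, by contrast, keeps a full set of parameters per task and instead adds a regularizer that ties the corresponding weights of the task-specific networks together.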
4 Citations
  • A Brief Review of Deep Multi-task Learning and Auxiliary Task Learning
  • Keypoint-Aligned Embeddings for Image Retrieval and Re-identification
  • Switchable Precision Neural Networks
