Adaptive Activation Network and Functional Regularization for Efficient and Flexible Deep Multi-Task Learning

@article{Liu2020AdaptiveAN,
  title={Adaptive Activation Network and Functional Regularization for Efficient and Flexible Deep Multi-Task Learning},
  author={Yingru Liu and X. Yang and Dongliang Xie and X. Wang and L. Shen and Haozhi Huang and Niranjan Balasubramanian},
  journal={ArXiv},
  year={2020},
  volume={abs/1911.08065}
}
Multi-task learning (MTL) is a common paradigm that seeks to improve the generalization performance of task learning by training related tasks simultaneously. However, finding a flexible and accurate architecture that can be shared among multiple tasks remains a challenging problem. In this paper, we propose a novel deep learning model called Task Adaptive Activation Network (TAAN) that can automatically learn the optimal network architecture for MTL. The main principle of TAAN is to…
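The abstract suggests that TAAN adapts its architecture through task-specific activation functions rather than a fixed, hand-designed sharing scheme. A minimal sketch of that idea, assuming (since the abstract is truncated) that each task's activation is a learned linear combination of fixed basis functions, with the combination coefficients acting as the per-task adaptive parameters:

```python
import numpy as np

def adaptive_activation(x, coeffs, bases):
    """Task-adaptive activation: a weighted sum of fixed basis functions.

    coeffs -- learnable per-task weights (one per basis function); the
              basis set and the linear-combination form are assumptions
              for illustration, not the paper's exact parameterization.
    """
    return sum(c * b(x) for c, b in zip(coeffs, bases))

# Hypothetical basis set: tanh, ReLU, and identity.
bases = [np.tanh, lambda z: np.maximum(z, 0.0), lambda z: z]

x = np.linspace(-2.0, 2.0, 5)
coeffs_task_a = np.array([0.7, 0.2, 0.1])  # would be learned per task
coeffs_task_b = np.array([0.1, 0.8, 0.1])  # a second task's coefficients

y_a = adaptive_activation(x, coeffs_task_a, bases)
y_b = adaptive_activation(x, coeffs_task_b, bases)
```

Under this reading, tasks whose coefficient vectors are similar effectively share the same nonlinearity (and hence behave like a shared sub-architecture), while dissimilar coefficients let tasks diverge, which is one way a functional regularizer on the coefficients could control the degree of sharing.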
