Deep multi-task learning with low level tasks supervised at lower layers

@inproceedings{Sgaard2016DeepML,
  title={Deep multi-task learning with low level tasks supervised at lower layers},
  author={Anders S{\o}gaard and Yoav Goldberg},
  booktitle={ACL},
  year={2016}
}
  • Anders Søgaard, Yoav Goldberg
  • Published in ACL 2016
  • Computer Science
  • In all previous work on deep multi-task learning we are aware of, all task supervisions are on the same (outermost) layer. [...] Finally, we also show how this architecture can be used for domain adaptation.
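The abstract's core idea is that a low-level auxiliary task (e.g. POS tagging) is supervised at a lower shared layer, while the main task reads from a higher layer. A minimal sketch of that wiring, assuming toy dense layers and illustrative layer assignments in place of the paper's bidirectional RNNs (shapes, task names, and the two-layer depth are assumptions, not the paper's exact setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: input, hidden, low-level labels, high-level labels.
d_in, d_h, n_low, n_high = 8, 16, 5, 3

W1 = rng.normal(0, 0.1, (d_in, d_h))        # shared layer 1
W2 = rng.normal(0, 0.1, (d_h, d_h))         # shared layer 2
W_low = rng.normal(0, 0.1, (d_h, n_low))    # low-level head (e.g. POS) reads layer 1
W_high = rng.normal(0, 0.1, (d_h, n_high))  # high-level head (e.g. chunking) reads layer 2

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, labels):
    # Mean negative log-likelihood of the true labels.
    return -np.log(probs[np.arange(len(labels)), labels]).mean()

x = rng.normal(size=(4, d_in))         # a mini-batch of 4 token vectors
y_low = rng.integers(0, n_low, 4)      # low-level task labels
y_high = rng.integers(0, n_high, 4)    # high-level task labels

h1 = np.tanh(x @ W1)                   # layer 1: low-level supervision attaches here
h2 = np.tanh(h1 @ W2)                  # layer 2: high-level supervision attaches here

# Joint objective: one loss per task, each tied to its own layer.
loss = (cross_entropy(softmax(h1 @ W_low), y_low)
        + cross_entropy(softmax(h2 @ W_high), y_high))
print(float(loss) > 0)
```

The point of the sketch is only the wiring: gradients from the low-level loss update `W1` directly, while the high-level loss shapes both layers, which is the contrast with prior work that attached every task to the outermost layer.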

    Citations

    Publications citing this paper (277 in total; a selection follows):

    • A Hierarchical Multi-task Approach for Learning Embeddings from Semantic Tasks (highly influenced)

    • SC-LSTM: Learning Task-Specific Representations in Multi-Task Learning for Sequence Labeling (highly influenced)

    • Latent Multi-Task Architecture Learning

    • Training Complex Models with Multi-Task Weak Supervision

    • Hierarchical Multi Task Learning With CTC

    Citation statistics

    • 15 highly influenced citations

    • Averaged 77 citations per year from 2018 through 2020
