Multi-task Multi-domain Representation Learning for Sequence Tagging

@article{Peng2016MultitaskMR,
  title={Multi-task Multi-domain Representation Learning for Sequence Tagging},
  author={Nanyun Peng and Mark Dredze},
  journal={CoRR},
  year={2016},
  volume={abs/1608.02689}
}
Representation learning with deep models has demonstrated success in a range of NLP tasks. In this paper we consider its use in a multi-task, multi-domain setting for sequence tagging, proposing a unified framework for learning across tasks and domains. Our model learns robust representations that yield better performance in this setting. We use shared CRFs and domain projections to allow the model to learn domain-specific representations that feed a single task-specific CRF. We evaluate our…
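The abstract's core idea can be illustrated with a minimal sketch (not the authors' code): each domain gets its own projection of a shared encoder output, and a single task-specific scorer consumes the projected representation. The dimensions, domain names, and the plain linear scorer below are illustrative assumptions standing in for the paper's encoder and CRF.

```python
import numpy as np

# Hedged sketch of domain projections feeding one shared task layer.
# A real model would use a learned encoder and a CRF over emission scores.
rng = np.random.default_rng(0)
HIDDEN, PROJ, NUM_TAGS = 8, 6, 4  # illustrative sizes

# One projection matrix per domain (hypothetical domain names).
domain_proj = {
    "news": rng.standard_normal((HIDDEN, PROJ)),
    "social": rng.standard_normal((HIDDEN, PROJ)),
}

# Single task-specific scoring layer shared across all domains.
W_task = rng.standard_normal((PROJ, NUM_TAGS))

def tag_scores(h, domain):
    """Project shared representations into a domain space, then score tags."""
    return h @ domain_proj[domain] @ W_task

h = rng.standard_normal((5, HIDDEN))  # 5 tokens of shared encoder output
scores = tag_scores(h, "news")
print(scores.shape)  # (5, 4): per-token emission scores over 4 tags
```

Because only `W_task` (and, in the full model, the CRF) is shared, gradients from every domain update the task layer, while each projection specializes to its domain.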
This paper has 20 citations and has been referenced on Twitter 18 times.


References

Publications referenced by this paper (selected from 51 references).

End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF

Xuezhe Ma, Eduard Hovy
In Proceedings of the Association for Computational Linguistics (ACL) • 2016

Retrofitting Word Vectors to Semantic Lexicons

Manaal Faruqui, Jesse Dodge, Sujay K. Jauhar, Chris Dyer, Eduard Hovy, Noah A. Smith
In North American Chapter of the Association for Computational Linguistics (NAACL) • 2015
