Semi-supervised sequence tagging with bidirectional language models

@inproceedings{Peters2017SemisupervisedST,
  title={Semi-supervised sequence tagging with bidirectional language models},
  author={Matthew E. Peters and Waleed Ammar and Chandra Bhagavatula and Russell Power},
  booktitle={ACL},
  year={2017}
}
Pre-trained word embeddings learned from unlabeled text have become a standard component of neural network architectures for NLP tasks. However, in most cases, the recurrent network that operates on word-level representations to produce context-sensitive representations is trained on relatively little labeled data. In this paper, we demonstrate a general semi-supervised approach for adding pretrained context embeddings from bidirectional language models to NLP systems and apply it to sequence…
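The core idea the abstract describes, augmenting each token's representation with frozen embeddings from a pretrained bidirectional language model before the sequence-labeling layers, can be sketched as follows. This is a minimal numpy illustration, not the paper's actual architecture: the dimensions are hypothetical, the biLM embeddings are simulated with random vectors, and a single linear projection stands in for the tagger's BiLSTM-and-CRF layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not taken from the paper): task word embeddings
# of size d_w, and bidirectional LM embeddings of size d_lm (forward and
# backward LM states concatenated).
d_w, d_lm, n_tags, seq_len = 50, 64, 5, 7

word_emb = rng.normal(size=(seq_len, d_w))  # task-trained word embeddings
lm_emb = rng.normal(size=(seq_len, d_lm))   # frozen pretrained biLM embeddings

# Core step: concatenate the pretrained context embeddings onto each
# token's representation before the sequence-labeling layers.
augmented = np.concatenate([word_emb, lm_emb], axis=-1)  # (seq_len, d_w + d_lm)

# Stand-in for the tagger's output layers: a linear projection to tag scores.
W = rng.normal(size=(d_w + d_lm, n_tags))
tag_scores = augmented @ W                   # (seq_len, n_tags)
predicted_tags = tag_scores.argmax(axis=-1)  # one tag index per token
```

The point of the concatenation is that the downstream tagger sees both the task-trained word embedding and a context-sensitive representation learned from large unlabeled corpora, which is what lets the approach help when labeled data is scarce.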

Citations

Publications citing this paper.
SHOWING 1-10 OF 92 CITATIONS, ESTIMATED 37% COVERAGE

Contextual String Embeddings for Sequence Labeling

CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling

  • EMNLP
  • 2018
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Deep contextualized word representations

CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Design Challenges and Misconceptions in Neural Sequence Labeling

CITES BACKGROUND, METHODS & RESULTS
HIGHLY INFLUENCED

Empower Sequence Labeling with Task-Aware Neural Language Model

  • AAAI
  • 2018
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Language Model Pre-training for Hierarchical Document Representations

CITES METHODS, BACKGROUND & RESULTS
HIGHLY INFLUENCED

A Deep Semantic Natural Language Processing Platform

CITES BACKGROUND & METHODS
HIGHLY INFLUENCED

A Survey on Deep Learning for Named Entity Recognition

CITES BACKGROUND
HIGHLY INFLUENCED

CITATION STATISTICS

  • 11 highly influenced citations

  • Averaged 37 citations per year over the last 3 years

  • 89% increase in citations per year in 2018 over 2017

References

Publications referenced by this paper.
SHOWING 1-10 OF 45 REFERENCES

Named Entity Recognition with Bidirectional LSTM-CNNs

  • Transactions of the Association for Computational Linguistics
  • 2016
HIGHLY INFLUENTIAL

Natural Language Processing (almost) from Scratch

  • Journal of Machine Learning Research
  • 2011
HIGHLY INFLUENTIAL
