Semi-supervised sequence tagging with bidirectional language models

@inproceedings{Peters2017SemisupervisedST,
  title={Semi-supervised sequence tagging with bidirectional language models},
  author={Matthew E. Peters and Waleed Ammar and Chandra Bhagavatula and Russell Power},
  booktitle={ACL},
  year={2017}
}
Pre-trained word embeddings learned from unlabeled text have become a standard component of neural network architectures for NLP tasks. [...] Key result: we evaluate our model on two standard datasets for named entity recognition (NER) and chunking, and in both cases achieve state of the art results, surpassing previous systems that use other forms of transfer or joint learning with additional labeled data and task-specific gazetteers.
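The abstract's core idea is to augment each token's representation with the hidden states of a pretrained bidirectional language model before feeding it to the sequence tagger. A minimal sketch of that concatenation step, with illustrative names and dimensions (not taken from the paper):

```python
import numpy as np


def augment_with_lm(token_embs, fwd_lm_states, bwd_lm_states):
    """Concatenate task word embeddings with pretrained LM states per token.

    token_embs:     (seq_len, d_word)  task-specific word embeddings
    fwd_lm_states:  (seq_len, d_lm)    forward LM hidden states
    bwd_lm_states:  (seq_len, d_lm)    backward LM hidden states
    returns:        (seq_len, d_word + 2 * d_lm) augmented representations
    """
    # Bidirectional LM embedding: forward and backward states side by side.
    lm_embs = np.concatenate([fwd_lm_states, bwd_lm_states], axis=-1)
    # The tagger consumes the word embedding plus the LM embedding.
    return np.concatenate([token_embs, lm_embs], axis=-1)


# Toy example: 3 tokens, 4-dim word embeddings, 2-dim LM states per direction.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(3, 4))
fwd = rng.normal(size=(3, 2))
bwd = rng.normal(size=(3, 2))
augmented = augment_with_lm(tokens, fwd, bwd)
print(augmented.shape)  # (3, 8)
```

In the paper the augmented vectors feed a BiLSTM-CRF tagger; this sketch shows only the representation step, which is where the unlabeled-text signal enters the model.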

Citations

Publications citing this paper (showing 1-10 of 220):

  • Contextual String Embeddings for Sequence Labeling

  • Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling

  • Gated Task Interaction Framework for Multi-task Sequence Tagging

  • Learning to select pseudo labels: a semi-supervised method for named entity recognition

  • Multi-Grained Named Entity Recognition

  • SC-LSTM: Learning Task-Specific Representations in Multi-Task Learning for Sequence Labeling

  • THU_NGN at SemEval-2019 Task 12: Toponym Detection and Disambiguation on Scientific Papers

CITATION STATISTICS

  • 28 highly influenced citations

  • Averaged 72 citations per year from 2017 through 2019

  • 79% increase in citations per year in 2019 over 2018

References

Publications referenced by this paper (showing 1-10 of 45):

  • End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF

  • Named Entity Recognition with Bidirectional LSTM-CNNs

  • Natural Language Processing (Almost) from Scratch

  • Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies