Corpus ID: 213968397

Deep contextualized word embeddings from character language models for neural sequence labeling

@inproceedings{Lief2019DeepCW,
  title={Deep contextualized word embeddings from character language models for neural sequence labeling},
  author={E. Lief},
  year={2019}
}
A family of Natural Language Processing (NLP) tasks, such as part-of-speech (PoS) tagging, Named Entity Recognition (NER), and Multiword Expression (MWE) identification, all involve assigning labels to sequences of words in text (sequence labeling). Most modern machine learning approaches to sequence labeling utilize word embeddings, learned representations of text in which words with similar meanings have similar representations. Quite recently, contextualized word embeddings have garnered much…
