Corpus ID: 52010710

Contextual String Embeddings for Sequence Labeling

@inproceedings{Akbik2018ContextualSE,
  title={Contextual String Embeddings for Sequence Labeling},
  author={Alan Akbik and Duncan A. J. Blythe and Roland Vollgraf},
  booktitle={COLING},
  year={2018}
}
Recent advances in language modeling using recurrent neural networks have made it viable to model language as distributions over characters. By learning to predict the next character on the basis of previous characters, such models have been shown to automatically internalize linguistic concepts such as words, sentences, subclauses and even sentiment. In this paper, we propose to leverage the internal states of a trained character language model to produce a novel type of word embedding which…
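The contextual string embeddings described in the abstract are available in the open-source Flair library. Below is a minimal sketch of how the internal states of a forward and a backward character language model can be combined into per-word embeddings; the model names 'news-forward' and 'news-backward' are assumed to refer to pre-trained character LMs distributed with Flair.

```python
# Minimal sketch using the Flair library (assumed installed via `pip install flair`).
from flair.data import Sentence
from flair.embeddings import FlairEmbeddings, StackedEmbeddings

# Forward and backward character-level language models; each word's vector is
# read from the LM's hidden states at the word's character boundaries.
forward_lm = FlairEmbeddings('news-forward')
backward_lm = FlairEmbeddings('news-backward')

# Concatenating both directions yields the contextual string embedding.
embeddings = StackedEmbeddings([forward_lm, backward_lm])

# Embed an example sentence: the same surface form ("Washington") receives
# different vectors depending on its surrounding context.
sentence = Sentence('Washington went to Washington .')
embeddings.embed(sentence)

for token in sentence:
    print(token.text, token.embedding.shape)
```

Because the embeddings are read from character-level states rather than a word lookup table, no fixed word vocabulary is required.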
596 Citations
• Pooled Contextualized Embeddings for Named Entity Recognition (131 citations)
• Deep contextualized word embeddings from character language models for neural sequence labeling (1 citation, Highly Influenced)
• Beyond Context: A New Perspective for Word Embeddings
• Pre-trained Contextualized Character Embeddings Lead to Major Improvements in Time Normalization: a Detailed Analysis (3 citations, Highly Influenced)
• HinFlair: pre-trained contextual string embeddings for pos tagging and text classification in the Hindi language (Highly Influenced)
• Locally-Contextual Nonlinear CRFs for Sequence Labeling (Highly Influenced)
• A Comparative Study on Word Embeddings in Deep Learning for Text Classification
• A Forgotten Strategy for Pooled Contextualized Embedding Learning (Highly Influenced)
• Word Embedding Evaluation in Downstream Tasks and Semantic Analogies
• Deep Contextualized Word Embeddings for Universal Dependency Parsing

References

Showing 1-10 of 34 references

• Deep contextualized word representations (5,630 citations, Highly Influential)
• Semi-supervised sequence tagging with bidirectional language models (381 citations, Highly Influential)
• Dependency Based Embeddings for Sentence Classification Tasks (96 citations)
• Empower Sequence Labeling with Task-Aware Neural Language Model (201 citations)
• Character-Aware Neural Language Models (1,301 citations, Highly Influential)
• Learning Word Vectors for 157 Languages (635 citations)
• Glove: Global Vectors for Word Representation (17,484 citations, Highly Influential)
• Sequence to Sequence Learning with Neural Networks (12,109 citations)
• Learning to Generate Reviews and Discovering Sentiment (310 citations)
• Exploring the Limits of Language Modeling (815 citations, Highly Influential)