Neural Architectures for Named Entity Recognition

@inproceedings{Lample2016NeuralAF,
  title={Neural Architectures for Named Entity Recognition},
  author={Guillaume Lample and Miguel Ballesteros and Sandeep Subramanian and Kazuya Kawakami and Chris Dyer},
  booktitle={HLT-NAACL},
  year={2016}
}

State-of-the-art named entity recognition systems rely heavily on hand-crafted features and domain-specific knowledge in order to learn effectively from the small, supervised training corpora that are available. In this paper, we introduce two new neural architectures---one based on bidirectional LSTMs and conditional random fields, and the other that constructs and labels segments using a transition-based approach inspired by shift-reduce parsers. Our models rely on two sources of information…
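
For a concrete picture of the first architecture, below is a minimal sketch in PyTorch of an LSTM-CRF tagger along the lines the abstract describes: word embeddings feed a bidirectional LSTM, a linear layer produces per-tag emission scores, and a learned tag-transition matrix is used for Viterbi decoding. All class and parameter names, the dimensions, and the omission of the character-level representations and the CRF training loss are simplifications for illustration; this is not the authors' reference implementation.

# Hypothetical sketch of a BiLSTM-CRF sequence tagger (not the paper's code).
import torch
import torch.nn as nn

class BiLSTMCRF(nn.Module):
    def __init__(self, vocab_size, num_tags, embed_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM encoder over the token sequence.
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2,
                            bidirectional=True, batch_first=True)
        # Linear layer mapping LSTM states to per-tag emission scores.
        self.emissions = nn.Linear(hidden_dim, num_tags)
        # CRF transition scores: transitions[i, j] = score of moving from tag i to tag j.
        self.transitions = nn.Parameter(torch.randn(num_tags, num_tags))

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices.
        h, _ = self.lstm(self.embed(token_ids))
        return self.emissions(h)  # (batch, seq_len, num_tags)

    def viterbi_decode(self, emissions):
        # Best tag sequence for one sentence; emissions: (seq_len, num_tags).
        seq_len, num_tags = emissions.shape
        score = emissions[0]
        backpointers = []
        for t in range(1, seq_len):
            # total[prev, next] = score so far + transition + emission at step t.
            total = score.unsqueeze(1) + self.transitions + emissions[t].unsqueeze(0)
            score, best_prev = total.max(dim=0)
            backpointers.append(best_prev)
        path = [int(score.argmax())]
        for bp in reversed(backpointers):
            path.append(int(bp[path[-1]]))
        return list(reversed(path))

In training, the emission and transition scores would be combined in a sentence-level CRF log-likelihood rather than a per-token loss; the paper's second, transition-based architecture is not sketched here.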

Citations

Publications citing this paper (showing a selection of 1,033 total citations).

Deep Dominance - How to Properly Compare Deep Neural Models
  Highly influenced; cites methods, results, and background.

Named entity recognition for Polish
  Michał Marcińczuk, Aleksander Wawer, 2019.
  Highly influenced; cites background and methods.

Sequence Time Expression Recognition in the Spanish Clinical Narrative
  2019 IEEE 32nd International Symposium on Computer-Based Medical Systems (CBMS), 2019.
  Highly influenced; cites methods.

Automatic Misogyny Identification Using Neural Networks
  Highly influenced; cites methods.

Citation statistics

  • 260 highly influenced citations

  • Averaged 320 citations per year from 2017 through 2019