Neural Architectures for Named Entity Recognition

Guillaume Lample, Miguel Ballesteros, Sandeep Subramanian, Kazuya Kawakami, and Chris Dyer.
Paper presented at the 2016 Conference of the North American Chapter of the Association for Computational Linguistics, held in San Diego (CA, USA), June 12-17, 2016.
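The architecture this paper proposes scores tag sequences with a bidirectional LSTM over the sentence plus a tag-transition matrix, and decodes the best sequence with Viterbi. A minimal sketch of that decoding step, with hand-set toy scores (the tag set, emission values, and transition values below are illustrative assumptions, not the paper's learned parameters):

```python
# Viterbi decoding for a linear-chain CRF tagging layer, as used on
# top of a BiLSTM in neural NER architectures. Scores are toy values.

def viterbi_decode(emissions, transitions, tags):
    """emissions: per-token dicts {tag: score};
    transitions: {(prev_tag, cur_tag): score}.
    Returns the highest-scoring tag sequence."""
    # best[t] = score of the best path ending in tag t at the current token
    best = {t: emissions[0][t] for t in tags}
    backptr = []
    for em in emissions[1:]:
        new_best, ptr = {}, {}
        for cur in tags:
            prev = max(tags, key=lambda p: best[p] + transitions[(p, cur)])
            ptr[cur] = prev
            new_best[cur] = best[prev] + transitions[(prev, cur)] + em[cur]
        best = new_best
        backptr.append(ptr)
    # backtrack from the highest-scoring final tag
    last = max(tags, key=lambda t: best[t])
    path = [last]
    for ptr in reversed(backptr):
        path.append(ptr[path[-1]])
    return list(reversed(path))

tags = ["O", "B-PER", "I-PER"]
# toy scores for a two-token sentence such as "John lives"
emissions = [
    {"O": 0.1, "B-PER": 2.0, "I-PER": 0.0},
    {"O": 1.5, "B-PER": 0.2, "I-PER": 0.3},
]
transitions = {(p, c): 0.0 for p in tags for c in tags}
transitions[("O", "I-PER")] = -10.0  # penalize the illegal O -> I-PER move
print(viterbi_decode(emissions, transitions, tags))  # ['B-PER', 'O']
```

The transition matrix is what lets the model enforce tagging-scheme constraints (e.g. an I- tag cannot follow O) jointly with the per-token emission scores.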


Named-entity recognition in Czech historical texts: Using a CNN-BiLSTM neural network model
The thesis presents named-entity recognition in Czech historical newspapers from Modern Access to Historical Sources Project. Our goal was to create a specific corpus and annotation manual for the ...
WNUT 2020 Shared Task-1: Conditional Random Field(CRF) based Named Entity Recognition(NER) for Wet Lab Protocols
The paper describes how a classifier model built with Conditional Random Fields detects named entities in wet laboratory protocols.
Unnamed Entity Recognition of Sense Mentions
This work considers the problem of recognizing mentions of human senses in text and proposes a method for acquiring labeled data and a learning method that is trained on this data.
Combining rule-based and statistical mechanisms for low-resource named entity recognition
We describe a multifaceted approach to named entity recognition that can be deployed with minimal data resources and a handful of hours of non-expert annotation. We describe how this approach was ...
Morphological Embeddings for Named Entity Recognition in Morphologically Rich Languages
This work proposes several schemes for representing the morphological analysis of a word in the context of named entity recognition and shows that a concatenation of this representation with the word and character embeddings improves the performance.
LM-Based Word Embeddings Improve Biomedical Named Entity Recognition: A Detailed Analysis
This research evaluates the effectiveness of contextualized word embeddings in the biomedical domain under multi-task settings with real-time requirements.
Integrating Approaches to Word Representation
A survey of the distributional, compositional, and relational approaches to addressing the problem of representing the atomic elements of language in modern neural learning systems is presented, with special emphasis on the word level and the out-of-vocabulary phenomenon.
Cross-lingual Transfer Learning for Japanese Named Entity Recognition
This work explores cross-lingual transfer learning for named entity recognition, focusing on bootstrapping Japanese from English, and presents a novel approach that overcomes linguistic differences between this language pair by romanizing a portion of the Japanese input.
Effectiveness of Character Language Model for Vietnamese Named Entity Recognition
Experimental results show that the proposed character language model achieves the current state-of-the-art end-to-end results for Vietnamese Named Entity Recognition.
Investigation on Data Adaptation Techniques for Neural Named Entity Recognition
This work investigates the impact of large monolingual unlabeled corpora and synthetic data from the original labeled data on the performance of three different named entity recognition tasks.


References

Introduction to the CoNLL-2003 Shared Task: Language-Independent Named Entity Recognition
The CoNLL-2003 shared task: language-independent named entity recognition is described and a general overview of the systems that have taken part in the task and discuss their performance is presented.
Introduction to the CoNLL-2002 Shared Task: Language-Independent Named Entity Recognition
The CoNLL-2002 shared task: language-independent named entity recognition is described and a general overview of the systems that have taken part in the task and discuss their performance is presented.
Language Independent NER using a Unified Model of Internal and Contextual Evidence
This paper investigates the use of a language independent model for named entity recognition based on iterative learning in a co-training fashion, using word-internal and contextual information as ...
Language Independent Named Entity Recognition Combining Morphological and Contextual Evidence
A language-independent bootstrapping algorithm based on iterative learning and re-estimation of contextual and morphological patterns captured in hierarchically smoothed trie models is described and evaluated.
Natural Language Processing (Almost) from Scratch
We propose a unified neural network architecture and learning algorithm that can be applied to various natural language processing tasks including part-of-speech tagging, chunking, and named entity recognition.
Lexicon Infused Phrase Embeddings for Named Entity Resolution
A new form of learning word embeddings that can leverage information from relevant lexicons to improve the representations, and the first system to use neural word embeddings to achieve state-of-the-art results on named-entity recognition on both the CoNLL and OntoNotes benchmarks, are presented.
Boosting Named Entity Recognition with Neural Character Embeddings
This work proposes a language-independent NER system that uses automatically learned features only, and demonstrates that the same neural network which has been successfully applied to POS tagging can also achieve state-of-the-art results for language-independent NER, using the same hyperparameters and without any handcrafted features.
Named Entity Recognition through Classifier Combination
This paper presents a classifier-combination experimental framework for named entity recognition in which four diverse classifiers (robust linear classifier, maximum entropy, transformation-based ...
Named Entity Recognition with Bidirectional LSTM-CNNs
A novel neural network architecture is presented that automatically detects word- and character-level features using a hybrid bidirectional LSTM and CNN architecture, eliminating the need for most feature engineering.
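The character-level half of such a hybrid tagger can be sketched compactly: embed each character, slide a fixed-width convolution over the character sequence, and max-pool into a fixed-size word feature vector. The toy sketch below uses random placeholder weights and a hypothetical window width and filter count, not the paper's trained model:

```python
# Toy character-level feature extractor in the style of hybrid
# BiLSTM-CNN taggers: char embeddings -> 1-D convolution -> max-pool.
# All weights are random placeholders, not learned parameters.
import numpy as np

rng = np.random.default_rng(0)
alphabet = {c: i for i, c in enumerate("abcdefghijklmnopqrstuvwxyz")}
char_emb = rng.standard_normal((len(alphabet), 4))  # 4-dim char embeddings
filters = rng.standard_normal((3 * 4, 5))           # window 3, 5 filters

def char_features(word):
    """Map a word to a fixed-size vector from its characters."""
    idx = [alphabet[c] for c in word.lower() if c in alphabet]
    emb = char_emb[idx]
    # pad both ends so every word yields at least one window
    pad = np.zeros((2, 4))
    emb = np.vstack([pad, emb, pad])
    # flatten each width-3 window and apply all filters at once
    windows = np.stack([emb[i:i + 3].ravel() for i in range(len(emb) - 2)])
    conv = windows @ filters        # (num_windows, 5)
    return conv.max(axis=0)         # max over positions -> (5,)

print(char_features("Obama").shape)  # (5,)
```

Max-pooling over character positions is what makes the output size independent of word length, so the resulting vector can be concatenated with a word embedding before the sentence-level BiLSTM.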
Character-level Convolutional Networks for Text Classification
This article constructed several large-scale datasets to show that character-level convolutional networks could achieve state-of-the-art or competitive results in text classification.