End-to-end Neural Coreference Resolution

@inproceedings{Lee2017EndtoendNC,
  title={End-to-end Neural Coreference Resolution},
  author={Kenton Lee and Luheng He and Mike Lewis and Luke Zettlemoyer},
  booktitle={EMNLP},
  year={2017}
}
We introduce the first end-to-end coreference resolution model and show that it significantly outperforms all previous work without using a syntactic parser or hand-engineered mention detector. [...] The model computes span embeddings that combine context-dependent boundary representations with a head-finding attention mechanism. It is trained to maximize the marginal likelihood of gold antecedent spans from coreference clusters and is factored to enable aggressive pruning of potential mentions…
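To make the span representation described above concrete, here is a minimal PyTorch sketch (not the authors' implementation; module names, dimensions, and the single-sequence interface are assumptions) of a span embedding that concatenates the boundary token states with a head-finding attention summary over the tokens inside the span.

```python
import torch
import torch.nn as nn

class SpanRepresentation(nn.Module):
    """Span embedding = [start state; end state; attention-weighted head state]."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Scores each token's suitability as the span's (soft) syntactic head.
        self.head_scorer = nn.Linear(hidden_dim, 1)

    def forward(self, token_states: torch.Tensor, start: int, end: int) -> torch.Tensor:
        # token_states: (seq_len, hidden_dim) context-dependent token encodings,
        # e.g. from a BiLSTM; start/end are inclusive span boundaries.
        span_states = token_states[start:end + 1]                 # (width, hidden)
        attn_logits = self.head_scorer(span_states).squeeze(-1)   # (width,)
        attn = torch.softmax(attn_logits, dim=0)
        head_repr = attn @ span_states                            # soft head embedding
        return torch.cat([token_states[start], token_states[end], head_repr], dim=-1)

# Example: states = torch.randn(50, 256); rep = SpanRepresentation(256)(states, 3, 7)
```

In the full model, every candidate span receives such an embedding, a unary mention score is used to prune candidates aggressively, and training maximizes the marginal likelihood of the gold antecedents of each mention.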
Neural End-to-end Coreference Resolution for German in Different Domains
We apply neural coreference resolution to German, surpassing the previous state-of-the-art performance by a wide margin of 10–30 F1 points across three established datasets for German. …
Fast End-to-end Coreference Resolution for Korean
TLDR
A BERT-SRU-based pointer network that leverages the linguistic property of head-final languages to reduce the coreference linking search space and achieves a 2x speedup in document processing time.
Neural Coreference Resolution with Deep Biaffine Attention by Joint Mention Detection and Mention Clustering
TLDR
This paper proposes to improve end-to-end coreference resolution by using a biaffine attention model to obtain antecedent scores for each possible mention and by jointly optimizing mention detection accuracy and mention clustering accuracy given the mention cluster labels.
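As a rough illustration of the biaffine scoring idea in this summary, the sketch below (an assumption-laden toy, not the paper's code) scores every candidate antecedent span j for every mention span i with a single biaffine map over projected span embeddings.

```python
import torch
import torch.nn as nn

class BiaffineAntecedentScorer(nn.Module):
    def __init__(self, span_dim: int, proj_dim: int = 128):
        super().__init__()
        self.proj_mention = nn.Linear(span_dim, proj_dim)
        self.proj_antecedent = nn.Linear(span_dim, proj_dim)
        # Bilinear weights; the extra row acts as the linear part of the biaffine map.
        self.bilinear = nn.Parameter(torch.randn(proj_dim + 1, proj_dim))

    def forward(self, spans: torch.Tensor) -> torch.Tensor:
        # spans: (num_spans, span_dim); returns (num_spans, num_spans) scores
        # where entry (i, j) scores span j as an antecedent of span i.
        m = self.proj_mention(spans)                              # (n, proj)
        a = self.proj_antecedent(spans)                           # (n, proj)
        m_aug = torch.cat([m, torch.ones(m.size(0), 1)], dim=-1)  # (n, proj + 1)
        return m_aug @ self.bilinear @ a.t()                      # (n, n)
```

A real system would additionally mask entries with j >= i so that only preceding spans can serve as antecedents.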
End-to-end Neural Coreference Resolution Revisited: A Simple yet Effective Baseline
TLDR
This work provides evidence for the necessity of carefully justifying the complexity of existing or newly proposed models, as introducing a conceptual or practical simplification to an existing model can still yield competitive results.
A Study on Improving End-to-End Neural Coreference Resolution
TLDR
A coreference cluster modification algorithm is introduced, which helps modify a coreference cluster to rule out dissimilar mentions and reduces errors caused by the global inconsistency of coreference clusters.
Coreference Resolution without Span Representations
TLDR
This work introduces a lightweight coreference model that removes the dependency on span representations, handcrafted features, and heuristics, and performs competitively with the current end-to-end model while being simpler and more efficient.
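The summary above is high level, so here is a hedged sketch of one way to score spans without materializing per-span vectors: keep only start- and end-token projections and compute all pairwise boundary scores with a single matrix product (names and dimensions are illustrative assumptions, not the paper's exact formulation).

```python
import torch
import torch.nn as nn

class BoundaryMentionScorer(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.start_proj = nn.Linear(hidden_dim, hidden_dim)
        self.end_proj = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, token_states: torch.Tensor) -> torch.Tensor:
        # token_states: (seq_len, hidden); returns (seq_len, seq_len) scores
        # where entry (i, j) rates the span starting at token i and ending at token j.
        starts = self.start_proj(token_states)   # (seq, hidden)
        ends = self.end_proj(token_states)       # (seq, hidden)
        # All start/end combinations at once; no explicit span embeddings are stored.
        return starts @ ends.t()
```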
BERT for Coreference Resolution
Several downstream NLP tasks, including knowledge extraction, hinge on effective coreference resolution, the task of determining which noun phrases in text refer to the same real-world entity. …
Understanding Mention Detector-Linker Interaction for Neural Coreference Resolution
TLDR
This work dissects the best instantiation of the mainstream end-to-end coreference resolution model that underlies most current best-performing coreference systems, and empirically analyzes the behavior of its two components: the mention detector and the mention linker.
Scaling Within Document Coreference to Long Texts
TLDR
This paper proposes an approximation to end-to-end models that scales gracefully to documents of any length, reducing time and memory complexity via token windows and nearest-neighbor sparsification methods for more efficient antecedent prediction.
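To illustrate the kind of sparsification this summary refers to, the sketch below (an assumed simplification, not the paper's method) limits each mention's candidate antecedents to preceding mentions within a token window and then keeps only the top-k by a cheap dot-product similarity.

```python
import torch

def sparse_antecedent_candidates(mention_starts: torch.Tensor,
                                 mention_embs: torch.Tensor,
                                 window: int = 512,
                                 k: int = 50) -> list:
    """mention_starts: (n,) token positions sorted in document order;
    mention_embs: (n, d) mention embeddings.
    Returns, for each mention i, indices of up to k preceding mentions whose
    start token lies within `window` tokens of mention i's start."""
    candidates = []
    for i in range(mention_starts.size(0)):
        prev = torch.arange(i)
        in_window = prev[(mention_starts[i] - mention_starts[prev]) <= window]
        if in_window.numel() > k:
            # Cheap nearest-neighbor filter before any expensive pairwise scoring.
            sims = mention_embs[in_window] @ mention_embs[i]
            in_window = in_window[sims.topk(k).indices]
        candidates.append(in_window)
    return candidates
```

Bounding each mention to at most k candidates keeps antecedent scoring roughly linear in the number of mentions rather than quadratic.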
End-to-End Neural Event Coreference Resolution
TLDR
An end-to-end event coreference approach, the E3C neural network, is proposed; it jointly models event detection and event coreference resolution and learns to extract features from raw text automatically.

References

Showing 1–10 of 28 references
Entity-Centric Coreference Resolution with Model Stacking
TLDR
This work trains an entity-centric coreference system that learns an effective policy for building up coreference chains incrementally, aggregating the scores produced by mention-pair models to define powerful entity-level features between clusters of mentions.
Improving Coreference Resolution by Learning Entity-Level Distributed Representations
TLDR
A neural-network-based coreference system that produces high-dimensional vector representations for pairs of coreference clusters, learns when combining clusters is desirable, and substantially outperforms the current state of the art on the English and Chinese portions of the CoNLL-2012 shared task dataset.
Learning Global Features for Coreference Resolution
TLDR
RNNs are proposed for learning latent, global representations of entity clusters directly from their mentions; these representations are especially useful for predicting pronominal mentions and can be incorporated into an end-to-end coreference system that outperforms the state of the art without requiring any additional search.
Latent Structures for Coreference Resolution
TLDR
A unified representation of different approaches to coreference resolution, in terms of the structure they operate on, is proposed, and a systematic analysis of the output of these approaches is conducted, highlighting differences and similarities.
Coreference Resolution in a Modular, Entity-Centered Model
TLDR
A generative, model-based approach is presented in which each of these factors is modularly encapsulated and learned in a primarily unsupervised manner, yielding the best results to date on the complete end-to-end coreference task.
Learning Structured Perceptrons for Coreference Resolution with Latent Antecedents and Non-local Features
TLDR
This work investigates different ways of learning structured perceptron models for coreference resolution when using non-local features and beam search, and obtains the best results to date on recent shared task data for Arabic, Chinese, and English.
Understanding the Value of Features for Coreference Resolution
TLDR
This paper describes a rather simple pairwise classification model for coreference resolution, developed with a well-designed set of features, and shows that it produces a state-of-the-art system that outperforms systems built with complex models.
A Multi-Pass Sieve for Coreference Resolution
TLDR
This work proposes a simple coreference architecture based on a sieve that applies tiers of deterministic coreference models one at a time, from highest to lowest precision, and outperforms many state-of-the-art supervised and unsupervised models on several standard corpora.
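For readers unfamiliar with the sieve design, here is a minimal Python sketch of the control flow (the individual sieve functions are hypothetical placeholders; only the high-to-low-precision ordering and the rule that earlier passes are never overridden come from the summary above).

```python
def run_sieve(mentions, sieves):
    """mentions: list of mention objects; sieves: callables ordered from highest
    to lowest precision, each returning a cluster id or None to abstain."""
    clusters = {}              # mention index -> cluster id
    for sieve in sieves:       # most precise pass first
        for i, mention in enumerate(mentions):
            if i in clusters:
                continue       # decisions from more precise passes stand
            cid = sieve(mention, clusters, mentions)
            if cid is not None:
                clusters[i] = cid
    return clusters
```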
Deep Reinforcement Learning for Mention-Ranking Coreference Models
TLDR
This paper applies reinforcement learning to directly optimize a neural mention-ranking model for coreference evaluation metrics, resulting in significant improvements over the current state of the art on the English and Chinese portions of the CoNLL-2012 Shared Task.
Latent Structure Perceptron with Feature Induction for Unrestricted Coreference Resolution
TLDR
A machine learning system based on a large-margin structured perceptron for unrestricted coreference resolution that introduces two key modeling techniques, latent coreference trees and entropy-guided feature induction, and achieves high performance with a linear learning algorithm.