Learning Global Features for Coreference Resolution

@inproceedings{Wiseman2016LearningGF,
  title={Learning Global Features for Coreference Resolution},
  author={Sam Wiseman and Alexander M. Rush and Stuart M. Shieber},
  booktitle={NAACL},
  year={2016}
}
There is compelling evidence that coreference prediction would benefit from modeling global information about entity-clusters. Yet, state-of-the-art performance can be achieved with systems treating each mention prediction independently, which we attribute to the inherent difficulty of crafting informative cluster-level features. We instead propose to use recurrent neural networks (RNNs) to learn latent, global representations of entity clusters directly from their mentions. We show that such…
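The abstract's core idea, running an RNN over the mention embeddings of an entity cluster to produce a single fixed-size global cluster representation, can be sketched as follows. This is a minimal pure-Python Elman-style RNN with toy dimensions and random illustrative weights, not the authors' implementation; all names and sizes here are assumptions for illustration.

```python
# Sketch: fold a coreference cluster's mention embeddings into one
# global hidden state with a simple (Elman-style) RNN. Weights and
# dimensions are toy/illustrative, not from the paper.
import math
import random

def rnn_cluster_state(mention_vecs, W_h, W_x, b):
    """Return the final hidden state after consuming each mention vector."""
    h = [0.0] * len(b)  # initial hidden state
    for x in mention_vecs:
        # h_new[i] = tanh( (W_h h)[i] + (W_x x)[i] + b[i] )
        h = [math.tanh(sum(W_h[i][j] * h[j] for j in range(len(h)))
                       + sum(W_x[i][k] * x[k] for k in range(len(x)))
                       + b[i])
             for i in range(len(b))]
    return h

random.seed(0)
D, H = 4, 3  # toy mention-embedding size and hidden size
W_h = [[random.uniform(-0.1, 0.1) for _ in range(H)] for _ in range(H)]
W_x = [[random.uniform(-0.1, 0.1) for _ in range(D)] for _ in range(H)]
b = [0.0] * H
cluster = [[1.0, 0.0, 0.5, -0.2],   # e.g. embedding of "Hillary Clinton"
           [0.8, 0.1, 0.4, -0.1]]   # e.g. embedding of "she"
state = rnn_cluster_state(cluster, W_h, W_x, b)
print(len(state))  # one fixed-size global representation per cluster
```

The cluster-level state could then be scored alongside local mention-pair features when deciding whether a new mention joins the cluster, which is the global signal the abstract argues is hard to hand-engineer.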
Improving Coreference Resolution by Learning Entity-Level Distributed Representations
A neural-network-based coreference system that produces high-dimensional vector representations for pairs of coreference clusters, learns when combining clusters is desirable, and substantially outperforms the current state of the art on the English and Chinese portions of the CoNLL 2012 Shared Task dataset.
Neural Models for Reasoning over Multiple Mentions Using Coreference
Many problems in NLP require aggregating information from multiple mentions of the same entity which may be far apart in the text. Existing Recurrent Neural Network (RNN) layers are biased towards…
Word Embeddings as Features for Supervised Coreference Resolution
This work investigates whether and to what extent features derived from word embeddings can be successfully used for supervised coreference resolution, and tests several different types of embedding-based features, including embedding-cluster and cosine-similarity-based features.
Pre-training of Mention Representations in Coreference Models
This work proposes two self-supervised tasks that are closely related to coreference resolution and thus improve mention representations; applying this approach to the GAP dataset yields new state-of-the-art results.
End-to-end Neural Coreference Resolution
This work introduces the first end-to-end coreference resolution model, trained to maximize the marginal likelihood of gold antecedent spans from coreference clusters and factored to enable aggressive pruning of potential mentions.
Neural Coreference Resolution with Deep Biaffine Attention by Joint Mention Detection and Mention Clustering
This paper proposes to improve the end-to-end coreference resolution system by using a biaffine attention model to get antecedent scores for each possible mention, jointly optimizing mention detection accuracy and mention clustering accuracy given the mention cluster labels.
Jointly Optimized Neural Coreference Resolution with Mutual Attention
This model is trained by jointly optimizing mention clustering and imbalanced mention detection, which enables it to detect more gold mentions in a document, make more accurate coreference decisions, and achieve state-of-the-art coreference performance compared with baselines.
Revisiting Memory-Efficient Incremental Coreference Resolution
This work explores the task of coreference resolution under fixed memory by extending an incremental clustering algorithm to utilize contextualized encoders and neural components, leading to an asymptotic reduction in memory usage while remaining competitive on task performance.
A Deep Learning Framework for Coreference Resolution Based on Convolutional Neural Network
This paper proposes a convolutional neural network model to extend word embeddings to mention/antecedent representations, and shows that the proposed system achieves competitive performance compared with state-of-the-art approaches.
Coreference Resolution with Entity Equalization
This work shows how to represent each mention in a cluster via an approximation of the sum of all mentions in the cluster, in a fully differentiable end-to-end manner, thus enabling high-order inferences in the resolution process.

References

Showing 1-10 of 50 references.
Entity-Centric Coreference Resolution with Model Stacking
This work trains an entity-centric coreference system that learns an effective policy for building up coreference chains incrementally, aggregating the scores produced by mention-pair models to define powerful entity-level features between clusters of mentions.
Supervised Models for Coreference Resolution
A cluster-ranking approach to coreference resolution that combines the strengths of mention rankers and entity-mention models is proposed, and experimental results on the ACE data sets demonstrate its superior performance to competing approaches.
Unsupervised Models for Coreference Resolution
Latent Structures for Coreference Resolution
A unified representation of different approaches to coreference resolution, in terms of the structure they operate on, is proposed, and a systematic analysis of the output of these approaches is conducted, highlighting differences and similarities.
Learning Structured Perceptrons for Coreference Resolution with Latent Antecedents and Non-local Features
This work investigates different ways of learning structured perceptron models for coreference resolution when using non-local features and beam search, and obtains the best results to date on recent shared task data for Arabic, Chinese, and English.
Learning Anaphoricity and Antecedent Ranking Features for Coreference Resolution
We introduce a simple, non-linear mention-ranking model for coreference resolution that attempts to learn distinct feature representations for anaphoricity detection and antecedent ranking, which we…
A Constrained Latent Variable Model for Coreference Resolution
The Latent Left-Linking model (L3M), a novel, principled, and linguistically motivated latent structured prediction approach to coreference resolution, is described, and it is shown that L3M admits efficient inference and can be augmented with knowledge-based constraints.
Easy-first Coreference Resolution
An approach to coreference resolution that relies on the intuition that easy decisions should be made early, while harder decisions should be left for later when more information is available, and that automatically learns from training data what constitutes an easy decision.
Understanding the Value of Features for Coreference Resolution
This paper describes a rather simple pairwise classification model for coreference resolution, developed with a well-designed set of features, and shows that it produces a state-of-the-art system that outperforms systems built with complex models.
Coreference Resolution in a Modular, Entity-Centered Model
A generative, model-based approach in which each of these factors is modularly encapsulated and learned in a primarily unsupervised manner is presented, yielding the best results to date on the complete end-to-end coreference task.