End-to-end Neural Coreference Resolution

@inproceedings{Lee2017EndtoendNC,
  title={End-to-end Neural Coreference Resolution},
  author={Kenton Lee and Luheng He and Mike Lewis and Luke Zettlemoyer},
  booktitle={EMNLP},
  year={2017}
}
We introduce the first end-to-end coreference resolution model and show that it significantly outperforms all previous work without using a syntactic parser or hand-engineered mention detector. The model computes span embeddings that combine context-dependent boundary representations with a head-finding attention mechanism. It is trained to maximize the marginal likelihood of gold antecedent spans from coreference clusters and is factored to enable aggressive pruning of potential mentions…
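To make the key method concrete, here is a minimal PyTorch sketch of a span embedding built from boundary states and head-finding attention, together with the marginal-likelihood objective over gold antecedents. This is an illustration under assumed shapes, not the authors' implementation; names such as `SpanEmbedder` and `gold_mask` are invented, and the real model adds feed-forward scorers, span-width features, and mention pruning.

```python
import torch
import torch.nn as nn

class SpanEmbedder(nn.Module):
    """Span embedding: boundary states plus an attention-weighted head vector."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.attn_scorer = nn.Linear(hidden_dim, 1)  # head-finding attention

    def forward(self, states, start, end):
        # states: (seq_len, hidden_dim) context-dependent token representations
        span = states[start:end + 1]
        alpha = torch.softmax(self.attn_scorer(span), dim=0)  # (span_len, 1)
        head = (alpha * span).sum(dim=0)                      # soft head word
        # combine the boundary representations with the attended head
        return torch.cat([states[start], states[end], head])

def marginal_log_likelihood(antecedent_scores, gold_mask):
    # antecedent_scores: (k,) scores over candidate antecedents (including a
    # dummy "no antecedent"); gold_mask: True where a candidate lies in the
    # mention's gold cluster. Training maximizes the marginal over gold spans.
    log_norm = torch.logsumexp(antecedent_scores, dim=0)
    gold = torch.logsumexp(antecedent_scores.masked_fill(~gold_mask, -1e9), dim=0)
    return gold - log_norm
```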

Citations

Neural End-to-end Coreference Resolution for German in Different Domains

TLDR
To support datasets representing the domains of both news and literature, this work uses two distinct model architectures: a mention-linking approach and an incremental entity-based approach that should scale to very long documents such as literary works.

Fast End-to-end Coreference Resolution for Korean

TLDR
A BERT-SRU-based pointer network that leverages the linguistic property of head-final languages to reduce the coreference-linking search space, achieving a 2x speedup in document processing time.

Neural Coreference Resolution with Deep Biaffine Attention by Joint Mention Detection and Mention Clustering

TLDR
This paper proposes to improve end-to-end coreference resolution by using a biaffine attention model to obtain antecedent scores for each possible mention, jointly optimizing mention detection accuracy and mention clustering accuracy given the mention cluster labels.
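As a rough illustration of the scoring idea, a biaffine antecedent scorer combines a bilinear term between a mention and each candidate antecedent with a linear term over the pair. The PyTorch sketch below assumes these shapes and is not the paper's exact parameterization.

```python
import torch
import torch.nn as nn

class BiaffineScorer(nn.Module):
    """Scores candidate antecedent a_j for mention m_i as
    m_i^T U a_j + w^T [m_i; a_j]. Dimensions are illustrative."""
    def __init__(self, dim):
        super().__init__()
        self.U = nn.Parameter(torch.randn(dim, dim) * 0.01)  # bilinear term
        self.w = nn.Linear(2 * dim, 1)                       # linear pair term

    def forward(self, mention, antecedents):
        # mention: (dim,), antecedents: (k, dim) -> (k,) antecedent scores
        bilinear = antecedents @ (self.U @ mention)
        pair = torch.cat([mention.expand_as(antecedents), antecedents], dim=-1)
        return bilinear + self.w(pair).squeeze(-1)
```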

End-To-End Neural Coreference Resolution Revisited: A Simple Yet Effective Baseline

  • T. Lai, Trung Bui, Doo Soon Kim
  • Computer Science
    ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2022
TLDR
This work provides evidence for the necessity of carefully justifying the complexity of existing or newly proposed models, as introducing a conceptual or practical simplification to an existing model can still yield competitive results.

A Study on Improving End-to-End Neural Coreference Resolution

TLDR
A coreference cluster modification algorithm is introduced, which helps modify a coreference cluster to rule out dissimilar mentions and reduce errors caused by global inconsistency across coreference clusters.
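A hedged sketch of what such a cluster-modification step could look like, assuming mention embeddings and a cosine-similarity criterion (both illustrative; the paper's actual dissimilarity test may differ):

```python
import torch
import torch.nn.functional as F

def prune_cluster(mention_embs, threshold=0.5):
    """Drop mentions whose cosine similarity to the cluster centroid falls
    below a threshold. Embeddings and threshold value are illustrative
    assumptions, not the paper's exact criterion."""
    centroid = mention_embs.mean(dim=0, keepdim=True)          # (1, d)
    sims = F.cosine_similarity(mention_embs, centroid, dim=1)  # (n,)
    keep = sims >= threshold
    return mention_embs[keep], keep
```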

Coreference Resolution without Span Representations

TLDR
A lightweight end-to-end coreference model that removes the dependency on span representations, handcrafted features, and heuristics and performs competitively with the current standard model, while being simpler and more efficient.

BERT for Coreference Resolution

TLDR
This paper explores the effect of incorporating bidirectional encoder representations from transformers (BERT) into two architectures for coreference, a rule-based heuristic and a mention-ranking model, and provides code for utilizing BERT in an end-to-end clustering coreference model.

Scaling Within Document Coreference to Long Texts

TLDR
This paper proposes an approximation to end-to-end models which scales gracefully to documents of any length, and reduces the time/memory complexity via token windows and nearest neighbor sparsification methods for more efficient antecedent prediction.
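The sparsification idea can be illustrated as coarse-to-fine antecedent pruning: score all preceding mentions cheaply, then keep only the top k per mention. The sketch below assumes a bilinear coarse score; it is an illustration, not the paper's method.

```python
import torch

def sparse_antecedent_candidates(mentions, k=50):
    """Nearest-neighbor sparsification sketch: for each mention, keep its k
    highest-scoring *preceding* mentions under a cheap bilinear score, so
    the expensive pairwise scorer sees O(n*k) pairs instead of O(n^2).
    `mentions` is (n, d) in document order; the scoring is an assumption."""
    scores = mentions @ mentions.T                   # coarse pairwise scores
    n = scores.size(0)
    precede = torch.tril(torch.ones(n, n, dtype=torch.bool), diagonal=-1)
    scores = scores.masked_fill(~precede, float("-inf"))
    return scores.topk(min(k, n), dim=1).indices     # -inf entries are padding
```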

End-to-End Neural Event Coreference Resolution

Incremental Neural Coreference Resolution in Constant Memory

TLDR
This work successfully converts a high-performing model (Joshi et al., 2020), asymptotically reducing its memory usage to constant space with only a 0.3% relative loss in F1 on OntoNotes 5.0.
...

References

Showing 1–10 of 28 references

Entity-Centric Coreference Resolution with Model Stacking

TLDR
This work trains an entity-centric coreference system that learns an effective policy for building up coreference chains incrementally, aggregating the scores produced by mention-pair models to define powerful entity-level features between clusters of mentions.

Improving Coreference Resolution by Learning Entity-Level Distributed Representations

TLDR
A neural-network-based coreference system that produces high-dimensional vector representations for pairs of coreference clusters, learns when combining clusters is desirable, and substantially outperforms the current state of the art on the English and Chinese portions of the CoNLL 2012 Shared Task dataset.
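For intuition, a cluster-pair scorer of this kind can be sketched as an MLP over pooled mention vectors of the two clusters. The mean/max pooling and layer sizes below are assumptions, not the paper's exact cluster-pair features.

```python
import torch
import torch.nn as nn

class ClusterPairScorer(nn.Module):
    """Scores whether merging two clusters is desirable, from pooled
    mention vectors. Pooling scheme and dimensions are illustrative."""
    def __init__(self, dim):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(4 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, c1, c2):
        # c1: (n1, dim), c2: (n2, dim) mention vectors of each cluster
        r1 = torch.cat([c1.mean(dim=0), c1.max(dim=0).values])
        r2 = torch.cat([c2.mean(dim=0), c2.max(dim=0).values])
        return self.mlp(torch.cat([r1, r2]))  # higher score -> merge
```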

Learning Global Features for Coreference Resolution

TLDR
RNNs are proposed to learn latent, global representations of entity clusters directly from their mentions; these representations are especially useful for predicting pronominal mentions and can be incorporated into an end-to-end coreference system that outperforms the state of the art without requiring any additional search.

Latent Structures for Coreference Resolution

TLDR
A unified representation of different approaches to coreference resolution in terms of the structure they operate on is proposed and a systematic analysis of the output of these approaches is conducted, highlighting differences and similarities.

Coreference Resolution in a Modular, Entity-Centered Model

TLDR
A generative, model-based approach is presented in which each of these factors is modularly encapsulated and learned in a primarily unsupervised manner, achieving the best results to date on the complete end-to-end coreference task.

Learning Structured Perceptrons for Coreference Resolution with Latent Antecedents and Non-local Features

TLDR
This work investigates different ways of learning structured perceptron models for coreference resolution when using non-local features and beam search and obtains the best results to date on recent shared task data for Arabic, Chinese, and English.

Understanding the Value of Features for Coreference Resolution

TLDR
This paper describes a rather simple pairwise classification model for coreference resolution, developed with a well-designed set of features, and shows that it produces a state-of-the-art system that outperforms systems built with complex models.

A Multi-Pass Sieve for Coreference Resolution

TLDR
This work proposes a simple coreference architecture based on a sieve that applies tiers of deterministic coreference models one at a time from highest to lowest precision, and outperforms many state-of-the-art supervised and unsupervised models on several standard corpora.
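The sieve's control flow is simple enough to sketch: deterministic passes, ordered by decreasing precision, each refining the clusters left by the previous pass. The pass functions themselves (exact string match, head match, pronoun rules, ...) are stand-ins here:

```python
def sieve_coreference(mentions, sieves):
    """Multi-pass sieve control flow: apply deterministic passes from
    highest to lowest precision; each pass sees, and may merge, the
    clusters produced by earlier passes. `sieves` is an ordered list of
    hypothetical functions mapping clusters -> clusters."""
    clusters = [{m} for m in mentions]  # start from singleton clusters
    for sieve in sieves:                # ordered by decreasing precision
        clusters = sieve(clusters)
    return clusters
```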

Deep Reinforcement Learning for Mention-Ranking Coreference Models

TLDR
This paper applies reinforcement learning to directly optimize a neural mention-ranking model for coreference evaluation metrics, resulting in significant improvements over the current state-of-the-art on the English and Chinese portions of the CoNLL 2012 Shared Task.
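Schematically, the reinforcement-learning setup reduces to a REINFORCE step over the antecedent distribution, with a coreference metric supplying the reward. The sketch below uses a `reward_fn` stand-in for that scorer, which is an assumption, not the paper's exact training loop.

```python
import torch

def reinforce_step(antecedent_logits, reward_fn):
    """One policy-gradient step for a mention-ranking policy: sample an
    antecedent, evaluate the resulting clustering with a coreference
    metric (e.g. a B-cubed/CEAF-style scorer), and scale the sampled
    action's log-probability by that reward."""
    dist = torch.distributions.Categorical(logits=antecedent_logits)
    action = dist.sample()
    reward = reward_fn(action)               # scalar coreference reward
    loss = -reward * dist.log_prob(action)   # REINFORCE objective
    return loss
```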

Latent Structure Perceptron with Feature Induction for Unrestricted Coreference Resolution

TLDR
A machine learning system based on a large-margin structured perceptron for unrestricted coreference resolution that introduces two key modeling techniques, latent coreference trees and entropy-guided feature induction, and achieves high performance with a linear learning algorithm.