Higher-Order Coreference Resolution with Coarse-to-Fine Inference

@inproceedings{Lee2018HigherOrderCR,
  title={Higher-Order Coreference Resolution with Coarse-to-Fine Inference},
  author={Kenton Lee and Luheng He and Luke Zettlemoyer},
  booktitle={North American Chapter of the Association for Computational Linguistics},
  year={2018}
}
We introduce a fully differentiable approximation to higher-order inference for coreference resolution. Our approach uses the antecedent distribution from a span-ranking architecture as an attention mechanism to iteratively refine span representations. This enables the model to softly consider multiple hops in the predicted clusters. To alleviate the computational cost of this iterative process, we introduce a coarse-to-fine approach that incorporates a less accurate but more efficient bilinear factor, enabling more aggressive pruning without hurting accuracy. Compared to the existing state-of-the-art span-ranking approach, our model…
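
To make the mechanics concrete, here is a minimal PyTorch sketch of the two ideas in the abstract, written for this page rather than taken from the authors' code: a cheap bilinear factor scores all span pairs and prunes each span's candidates to the top K, an expensive feed-forward scorer runs only on the survivors, and the resulting antecedent distribution is used as attention to iteratively refine the span representations through a gated update. Layer names and shapes are illustrative, and the dummy antecedent and the restriction to preceding spans are omitted for brevity.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CoarseToFineScorer(nn.Module):
    def __init__(self, dim, top_k):
        super().__init__()
        self.top_k = top_k
        self.coarse_proj = nn.Linear(dim, dim, bias=False)  # bilinear factor: g_i^T W g_j
        self.fine = nn.Sequential(                          # slower pairwise FFNN scorer
            nn.Linear(3 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))
        self.gate = nn.Linear(2 * dim, dim)                 # controls the span update

    def pair_scores(self, g):
        n, d = g.shape
        coarse = self.coarse_proj(g) @ g.t()                # (n, n) cheap scores for all pairs
        k = min(self.top_k, n)
        top_coarse, idx = coarse.topk(k, dim=1)             # aggressive pruning to top-K
        ant = g[idx]                                        # (n, k, d) surviving candidates
        anaphor = g.unsqueeze(1).expand(-1, k, -1)
        pair = torch.cat([anaphor, ant, anaphor * ant], dim=-1)
        fine = self.fine(pair).squeeze(-1)                  # fine scores on survivors only
        return top_coarse + fine, idx

    def forward(self, g, iters=2):
        for _ in range(iters):                              # softly consider multiple hops
            scores, idx = self.pair_scores(g)
            p = F.softmax(scores, dim=1)                    # antecedent distribution
            expected_ant = (p.unsqueeze(-1) * g[idx]).sum(dim=1)
            f = torch.sigmoid(self.gate(torch.cat([g, expected_ant], dim=-1)))
            g = f * g + (1 - f) * expected_ant              # gated refinement of spans
        return self.pair_scores(g)

scorer = CoarseToFineScorer(dim=64, top_k=10)
scores, antecedent_ids = scorer(torch.randn(20, 64))        # 20 candidate spans

The point of the coarse factor is purely computational: the fine scorer's cost drops from all n^2 pairs to only n*K surviving pairs.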

Citations

Revealing the Myth of Higher-Order Inference in Coreference Resolution

This paper implements an end-to-end coreference system together with four higher-order inference (HOI) approaches: attended antecedents, entity equalization, span clustering, and cluster merging, the latter two being newly proposed in that work.

Coreference Resolution with Entity Equalization

This work shows how to represent each mention via an approximation of the sum of all mentions in its cluster, in a fully differentiable, end-to-end manner, thus enabling higher-order inference in the resolution process.
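
A rough sketch of that approximation, assuming the antecedent distribution has already been computed; this illustrates the spirit of entity equalization rather than the paper's exact formulation:

import torch

def entity_equalization(g, ant_probs):
    # g:         (n, d) span representations, in document order.
    # ant_probs: (n, n), where ant_probs[i, j] is P(span j is the antecedent
    #            of span i) for j < i, and ant_probs[i, i] is the probability
    #            that span i starts a new entity; each row sums to 1.
    n, _ = g.shape
    q = torch.zeros(n, n)              # q[i, c]: soft membership of span i in
    for i in range(n):                 # the cluster opened by span c
        q[i, i] = ant_probs[i, i]
        for j in range(i):
            q[i] = q[i] + ant_probs[i, j] * q[j]  # inherit the antecedent's membership
    cluster_sums = q.t() @ g           # (n, d): soft sum of each cluster's mentions
    return q @ cluster_sums            # equalized representation for every span

Because every step is a matrix operation over soft memberships, gradients flow through the clustering decision rather than through a hard, non-differentiable assignment.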

Scaling Within Document Coreference to Long Texts

This paper proposes an approximation to end-to-end models which scales gracefully to documents of any length, and reduces the time/memory complexity via token windows and nearest neighbor sparsification methods for more efficient antecedent prediction.

Coreference Resolution without Span Representations

A lightweight end-to-end coreference model that removes the dependency on span representations, handcrafted features, and heuristics and performs competitively with the current standard model, while being simpler and more efficient.

End-To-End Neural Coreference Resolution Revisited: A Simple Yet Effective Baseline

  • T. Lai, Trung Bui, Doo Soon Kim
  • ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2022
This work provides evidence for the necessity of carefully justifying the complexity of existing or newly proposed models, as introducing a conceptual or practical simplification to an existing model can still yield competitive results.

On Generalization in Coreference Resolution

It is found that in a zero-shot setting, models trained on a single dataset transfer poorly while joint training yields improved overall performance, leading to better generalization in coreference resolution models.

Revisiting Memory-Efficient Incremental Coreference Resolution

This work explores the task of coreference resolution under fixed memory by extending an incremental clustering algorithm to utilize contextualized encoders and neural components, leading to an asymptotic reduction in memory usage while remaining competitive on task performance.
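
As a sketch of what fixed-memory incremental clustering looks like in code (the similarity measure, threshold, capacity, and eviction policy below are illustrative assumptions, not the paper's design):

import torch
import torch.nn.functional as F

def incremental_cluster(mentions, threshold=0.5, max_active=50):
    reps, active, assignments, next_id = {}, [], [], 0
    for m in mentions:                          # mentions in reading order
        if active:
            sims = torch.stack([F.cosine_similarity(m, reps[e], dim=0) for e in active])
            best = active[int(sims.argmax())]
            if float(sims.max()) > threshold:   # attach to an existing entity
                reps[best] = 0.5 * (reps[best] + m)   # running-average update
                active.remove(best)
                active.append(best)             # mark as most recently used
                assignments.append(best)
                continue
        reps[next_id] = m                       # open a new entity
        active.append(next_id)
        assignments.append(next_id)
        next_id += 1
        if len(active) > max_active:            # bounded memory: evict the
            del reps[active.pop(0)]             # least recently used entity
    return assignments

labels = incremental_cluster(torch.randn(100, 64))   # 100 mention embeddings

Memory stays bounded by max_active regardless of document length, which is the source of the asymptotic reduction the summary refers to.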

End-to-end Deep Reinforcement Learning Based Coreference Resolution

This paper presents an end-to-end reinforcement-learning-based coreference resolution model that directly optimizes coreference evaluation metrics, adding maximum entropy regularization for adequate exploration so that the model does not prematurely converge to a bad local optimum.
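
In isolation, the maximum-entropy idea looks like the following generic REINFORCE-style loss with an entropy bonus; this is a textbook form rather than the paper's exact objective, and beta is an assumed coefficient:

import torch

def pg_loss_with_entropy(logits, actions, rewards, beta=0.01):
    # logits:  (n, m) antecedent scores per mention
    # actions: (n,) sampled antecedent indices
    # rewards: (n,) scalar rewards derived from coreference metrics
    log_probs = torch.log_softmax(logits, dim=1)
    chosen = log_probs.gather(1, actions.unsqueeze(1)).squeeze(1)
    entropy = -(log_probs.exp() * log_probs).sum(dim=1)  # keeps the policy exploratory
    return -(rewards * chosen + beta * entropy).mean()

Raising beta flattens the policy early in training, which is exactly the exploration pressure the summary describes.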

Graph Refinement for Coreference Resolution

This work proposes a modelling approach that learns coreference at the document level and takes global decisions, modelling coreference links in a graph structure where the nodes are tokens in the text and the edges represent the relationships between them.

Fast End-to-end Coreference Resolution for Korean

A BERT-SRU-based pointer network that leverages the linguistic properties of head-final languages to reduce the coreference-linking search space, achieving a 2x speedup in document processing time.
...

References

Showing 1-10 of 17 references

End-to-end Neural Coreference Resolution

This work introduces the first end-to-end coreference resolution model, trained to maximize the marginal likelihood of gold antecedent spans from coreference clusters and is factored to enable aggressive pruning of potential mentions.
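
The marginal-likelihood objective is compact enough to sketch; the shapes and the gold-mask convention below are illustrative:

import torch

def marginal_antecedent_loss(scores, gold_mask):
    # scores:    (n, m) antecedent scores per span, with column 0 as the
    #            dummy antecedent for non-anaphoric spans.
    # gold_mask: (n, m) boolean, True for gold antecedents (or the dummy
    #            when a span has no gold antecedent).
    log_probs = torch.log_softmax(scores, dim=1)
    gold = log_probs.masked_fill(~gold_mask, float('-inf'))
    return -torch.logsumexp(gold, dim=1).mean()  # marginalize over all gold antecedents

Marginalizing over antecedents, rather than supervising a single link, is what lets the model learn from cluster-level annotation without committing to one gold tree.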

Improving Coreference Resolution by Learning Entity-Level Distributed Representations

A neural-network-based coreference system that produces high-dimensional vector representations for pairs of coreference clusters, learns when combining clusters is desirable, and substantially outperforms the current state of the art on the English and Chinese portions of the CoNLL 2012 Shared Task dataset.

Learning Global Features for Coreference Resolution

This work proposes using RNNs to learn latent, global representations of entity clusters directly from their mentions; these representations are especially useful for predicting pronominal mentions and can be incorporated into an end-to-end coreference system that outperforms the state of the art without requiring any additional search.

Latent Structures for Coreference Resolution

A unified representation of different approaches to coreference resolution in terms of the structure they operate on is proposed and a systematic analysis of the output of these approaches is conducted, highlighting differences and similarities.

Deep Reinforcement Learning for Mention-Ranking Coreference Models

This paper applies reinforcement learning to directly optimize a neural mention-ranking model for coreference evaluation metrics, resulting in significant improvements over the current state-of-the-art on the English and Chinese portions of the CoNLL 2012 Shared Task.

Entity-Centric Coreference Resolution with Model Stacking

This work trains an entity-centric coreference system that learns an effective policy for building up coreference chains incrementally, aggregating the scores produced by mention-pair models to define powerful entity-level features between clusters of mentions.

Learning Structured Perceptrons for Coreference Resolution with Latent Antecedents and Non-local Features

This work investigates different ways of learning structured perceptron models for coreference resolution when using non-local features and beam search, and obtains the best results to date on recent shared task data for Arabic, Chinese, and English.

Understanding the Value of Features for Coreference Resolution

This paper describes a rather simple pairwise classification model for coreference resolution, developed with a well-designed set of features, and shows that this produces a state-of-the-art system that outperforms systems built with complex models.

Learning Anaphoricity and Antecedent Ranking Features for Coreference Resolution

We introduce a simple, non-linear mention-ranking model for coreference resolution that attempts to learn distinct feature representations for anaphoricity detection and antecedent ranking.

Specialized Models and Ranking for Coreference Resolution

This paper investigates two strategies for improving coreference resolution by training separate models that specialize in particular types of mentions and using a ranking loss function rather than a classification function, showing that on the ACE corpus both strategies deliver significant performance improvements.