Concise Integer Linear Programming Formulations for Dependency Parsing

@inproceedings{Martins2009ConciseIL,
  title={Concise Integer Linear Programming Formulations for Dependency Parsing},
  author={Andr{\'e} F. T. Martins and Noah A. Smith and Eric P. Xing},
  booktitle={ACL},
  year={2009}
}
We formulate the problem of non-projective dependency parsing as a polynomial-sized integer linear program. Our formulation is able to handle non-local output features in an efficient manner; not only is it compatible with prior knowledge encoded as hard constraints, it can also learn soft constraints from data. In particular, our model is able to learn correlations among neighboring arcs (siblings and grandparents), word valency, and tendencies toward nearly-projective parses. The model… 
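
As a rough illustration of the kind of formulation the abstract describes (not the paper's exact program), the sketch below encodes arc-factored non-projective parsing as an integer linear program: one binary variable per candidate arc, a single-head constraint per word, and single-commodity-flow constraints for connectivity to the root. The helper name parse_ilp, the arc_scores input, and the use of the PuLP library are assumptions made for illustration only.

import pulp

def parse_ilp(arc_scores, n):
    """Hypothetical helper: arc_scores[(h, m)] is the score of the arc from
    head h to modifier m, with h in 0..n, m in 1..n, and no arcs into the
    artificial root 0. Returns the arcs of the highest-scoring tree."""
    prob = pulp.LpProblem("dependency_parsing", pulp.LpMaximize)
    z = {a: pulp.LpVariable(f"z_{a[0]}_{a[1]}", cat="Binary") for a in arc_scores}

    # Objective: total score of the selected arcs.
    prob += pulp.lpSum(arc_scores[a] * z[a] for a in z)

    # Each modifier takes exactly one head.
    for m in range(1, n + 1):
        prob += pulp.lpSum(z[(h, m)] for h in range(n + 1) if (h, m) in z) == 1

    # Single-commodity flow: the root implicitly sends n units, every other
    # word consumes one, and flow may only travel on selected arcs. Together
    # with the single-head constraints this forces the arcs to form a tree.
    f = {a: pulp.LpVariable(f"f_{a[0]}_{a[1]}", lowBound=0) for a in arc_scores}
    for a in arc_scores:
        prob += f[a] <= n * z[a]
    for m in range(1, n + 1):
        inflow = pulp.lpSum(f[(h, m)] for h in range(n + 1) if (h, m) in z)
        outflow = pulp.lpSum(f[(m, k)] for k in range(1, n + 1) if (m, k) in z)
        prob += inflow - outflow == 1

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return [a for a in z if z[a].value() > 0.5]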

Figures and Tables from this paper

Citations

Dependency Parsing with Bounded Block Degree and Well-nestedness via Lagrangian Relaxation and Branch-and-Bound
We present a novel dependency parsing method which enforces two structural properties on dependency trees: bounded block degree and well-nestedness. These properties are useful to better represent…
Turbo Parsers: Dependency Parsing by Approximate Variational Inference
A unified view of two state-of-the-art non-projective dependency parsers, both approximate, is presented and a new aggressive online algorithm to learn the model parameters is proposed, which makes use of the underlying variational representation.
Approximation Strategies for Multi-Structure Sentence Compression
This work explores the use of Lagrangian relaxation to decouple the two subproblems of sentence compression and produces results comparable to a state-of-the-art integer linear programming formulation for the same joint inference task along with a significant improvement in runtime.
Branch and Bound Algorithm for Dependency Parsing with Non-local Features
Xian Qian, Yang Liu · Transactions of the Association for Computational Linguistics, 2013
This paper proposes an exact and efficient decoding algorithm based on the Branch and Bound (B&B) framework, where non-local features are bounded by a linear combination of local features.
On Dual Decomposition and Linear Programming Relaxations for Natural Language Processing
Dual decomposition is presented as a framework for deriving inference algorithms for NLP problems; it relies on standard dynamic-programming algorithms as oracle solvers for sub-problems, together with a simple method for forcing agreement between the different oracles (a generic sketch of this agreement step follows this list).
Dependency Parsing with Undirected Graphs
We introduce a new approach to transition-based dependency parsing in which the parser does not directly construct a dependency structure, but rather an undirected graph, which is then converted into…
Dual Decomposition for Parsing with Non-Projective Head Automata
This paper introduces algorithms for non-projective parsing based on dual decomposition and a generalization of head-automata models to non-projective structures; the accuracy of the models is higher than that of previous work on a broad range of datasets.
Polyhedral outer approximations with application to natural language parsing
This paper establishes risk bounds for max-margin learning with LP-relaxed inference and proposes a new paradigm that attempts to penalize "time-consuming" hypotheses.
Transition-Based Techniques for Non-Projective Dependency Parsing
It is found that the use of non-adjacent arc transitions may lead to a drop in accuracy on projective dependencies in the presence of long-distance non-projective dependencies, an effect that is not found for the two other techniques.
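
The two dual-decomposition entries above both turn on forcing agreement between sub-problem oracles via Lagrange multipliers. The generic subgradient sketch below illustrates only that agreement step; oracle_a, oracle_b, the vector size, and the step-size schedule are hypothetical placeholders, not the cited papers' implementations.

import numpy as np

def dual_decompose(oracle_a, oracle_b, size, iters=200, step=1.0):
    """Hypothetical oracles: oracle_a(u) returns argmax_y f(y) + u.y and
    oracle_b(u) returns argmax_z g(z) - u.z, both as 0/1 arrays of length
    `size`. We minimize the dual by subgradient descent on the multipliers."""
    u = np.zeros(size)
    y = oracle_a(u)
    for t in range(1, iters + 1):
        y = oracle_a(u)
        z = oracle_b(u)
        if np.array_equal(y, z):        # oracles agree: provably optimal
            return y
        u -= (step / t) * (y - z)       # subgradient step on the multipliers
    return y                            # no agreement: return one oracle's guess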

References

Showing 1-10 of 39 references
Incremental Integer Linear Programming for Non-projective Dependency Parsing
This work presents an approach that solves the problem incrementally, avoiding intractable integer linear programs, and shows how the addition of linguistically motivated constraints can yield a significant improvement over the state of the art.
Dependency Parsing by Belief Propagation
This work shows how to apply loopy belief propagation (BP), a simple and effective tool for approximate learning and inference, yielding a parsing algorithm that is both asymptotically and empirically efficient.
On the Complexity of Non-Projective Data-Driven Dependency Parsing
In this paper we investigate several non-projective parsing algorithms for dependency parsing, providing novel polynomial time solutions under the assumption that each dependency decision is…
Polyhedral outer approximations with application to natural language parsing
This paper establishes risk bounds for max-margin learning with LP-relaxed inference and proposes a new paradigm that attempts to penalize "time-consuming" hypotheses.
Non-Projective Dependency Parsing using Spanning Tree Algorithms
Using this representation, the parsing algorithm of Eisner (1996) is sufficient for searching over all projective trees in O(n^3) time, and parsing extends naturally to the non-projective case using the Chu-Liu-Edmonds (Chu and Liu, 1965; Edmonds, 1967) maximum spanning tree algorithm, yielding an O(n^2) parsing algorithm (a minimal decoding sketch follows this list).
Global inference for sentence compression: an integer linear programming approach
This work shows how previous formulations of sentence compression can be recast as ILPs and extends these models with novel global constraints to infer globally optimal compressions in the presence of linguistically motivated constraints.
Online Learning of Approximate Dependency Parsing Algorithms
In this paper we extend the maximum spanning tree (MST) dependency parsing framework of McDonald et al. (2005c) to incorporate higher-order feature representations and allow dependency structures…
TAG, Dynamic Programming, and the Perceptron for Efficient, Feature-Rich Parsing
A parsing approach is presented that uses the perceptron algorithm, in conjunction with dynamic programming methods, to recover full constituent-based parse trees under a Tree Adjoining Grammar (TAG) based parsing formalism.
Experiments with a Higher-Order Projective Dependency Parser
In the multilingual exercise of the CoNLL-2007 shared task (Nivre et al., 2007), the system obtains the best accuracy for English, and the second best accuracies for Basque and Czech.
Stacking Dependency Parsers
Experiments on twelve languages show that stacking transition-based and graph-based parsers improves performance over existing state-of-the-art dependency parsers.
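
As a companion to the spanning-tree entry above, the sketch below shows non-projective decoding as a maximum spanning arborescence. It leans on networkx's maximum_spanning_arborescence (an implementation of the Chu-Liu-Edmonds/Edmonds algorithm) rather than the cited paper's hand-rolled O(n^2) routine; the names mst_parse and arc_scores are assumptions for illustration.

import networkx as nx

def mst_parse(arc_scores, n):
    """Hypothetical helper: arc_scores[(h, m)] is the score of the arc from
    head h to modifier m, with h in 0..n and m in 1..n. Node 0 is the
    artificial root and has no incoming candidate arcs, so every spanning
    arborescence is rooted there."""
    G = nx.DiGraph()
    G.add_nodes_from(range(n + 1))
    for (h, m), s in arc_scores.items():
        G.add_edge(h, m, weight=s)
    # Maximum-weight arborescence = highest-scoring non-projective tree
    # under an arc-factored model.
    tree = nx.maximum_spanning_arborescence(G, attr="weight")
    return set(tree.edges())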