Discontinuous Incremental Shift-reduce Parsing

@inproceedings{Maier2015DiscontinuousIS,
  title={Discontinuous Incremental Shift-reduce Parsing},
  author={Wolfgang Maier},
  booktitle={ACL},
  year={2015}
}
We present an extension to incremental shift-reduce parsing that handles discontinuous constituents, using a linear classifier and beam search. We achieve very high parsing speeds (up to 640 sent./sec.) and accurate results (up to 79.52 F1 on TiGer). 
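The approach described in the abstract can be illustrated with a minimal, hypothetical sketch of beam-search shift-reduce parsing guided by a linear scorer. The transition set (plain SHIFT/REDUCE), features, and weights below are illustrative assumptions only, not Maier's actual system, which additionally handles discontinuous constituents and learns its weights (e.g. with a structured perceptron).

```python
# Illustrative sketch: beam-search shift-reduce parsing with a linear scorer.
# Transition names, features, and weights are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class State:
    stack: tuple        # partial constituents, as bracketed strings
    buffer: tuple       # remaining input tokens
    score: float = 0.0  # cumulative linear-model score

def legal_transitions(state):
    ts = []
    if state.buffer:
        ts.append("SHIFT")    # move the next token onto the stack
    if len(state.stack) >= 2:
        ts.append("REDUCE")   # combine the top two stack items
    return ts

def transition_score(state, t, weights):
    # Linear model: dot product of sparse binary features with weights.
    feats = [t]
    if state.stack:
        feats.append((t, "top=" + state.stack[-1]))
    return sum(weights.get(f, 0.0) for f in feats)

def apply_transition(state, t, weights):
    new_score = state.score + transition_score(state, t, weights)
    if t == "SHIFT":
        return State(state.stack + (state.buffer[0],), state.buffer[1:], new_score)
    left, right = state.stack[-2], state.stack[-1]
    return State(state.stack[:-2] + (f"({left} {right})",), state.buffer, new_score)

def is_final(state):
    return not state.buffer and len(state.stack) == 1

def beam_parse(tokens, weights, beam_size=4):
    # Keep the beam_size highest-scoring partial analyses at each step.
    beam = [State((), tuple(tokens))]
    while not all(is_final(s) for s in beam):
        candidates = []
        for s in beam:
            if is_final(s):
                candidates.append(s)  # finished analyses carry over
            else:
                for t in legal_transitions(s):
                    candidates.append(apply_transition(s, t, weights))
        beam = sorted(candidates, key=lambda s: s.score, reverse=True)[:beam_size]
    return beam[0]  # highest-scoring complete analysis
```

For example, `beam_parse(["a", "b", "c"], {"REDUCE": 1.0})` yields a complete binary bracketing of the input with a cumulative score of 2.0 (one unit per REDUCE). A real discontinuous parser in this family would extend the transition inventory (e.g. with swap- or gap-style actions) so that non-adjacent material can be combined.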


Citations

Discontinuous parsing with continuous trees
We introduce a new method for incremental shift-reduce parsing of discontinuous constituency trees, based on the fact that discontinuous trees can be transformed into continuous trees by changing the …

Discontinuous Constituent Parsing as Sequence Labeling
This paper reduces discontinuous parsing to sequence labeling, proposes to encode tree discontinuities as nearly ordered permutations of the input sequence, and studies whether such discontinuous representations are learnable.

A Derivational Model of Discontinuous Parsing
The notion of latent-variable probabilistic context-free derivation of syntactic structures is enhanced to allow heads and unrestricted discontinuities; the resulting model lends itself to intrinsic evaluation in terms of perplexity, as shown in experiments.

Incremental Discontinuous Phrase Structure Parsing with the GAP Transition
A novel transition system for discontinuous lexicalized constituent parsing, SR-GAP, is introduced: an extension of the shift-reduce algorithm with an additional gap transition. It outperforms the previous best transition-based discontinuous parser by a large margin.

Transition-Based Left-Corner Parsing for Identifying PTB-Style Nonlocal Dependencies
This paper proposes a left-corner parser that can identify nonlocal dependencies, using a structured perceptron that enables the parser to utilize global features captured by nonlocal dependencies.

Reducing Discontinuous to Continuous Parsing with Pointer Network Reordering
A Pointer Network is developed that accurately generates the continuous token arrangement for a given input sentence, together with a bijective function that recovers the original order.

A Unifying Theory of Transition-based and Sequence Labeling Parsing
A mapping is defined from transition-based parsing algorithms that read sentences from left to right to sequence-labeling encodings of syntactic trees; applying it to dependency parsing yields sequence-labeling versions of four algorithms.

Generic refinement of expressive grammar formalisms with an application to discontinuous constituent parsing
We formulate a generalization of Petrov et al. (2006)’s split/merge algorithm for interpreted regular tree grammars (Koller and Kuhlmann, 2011), which capture a large class of grammar formalisms. We …

Advances in Using Grammars with Latent Annotations for Discontinuous Parsing
It is found that the grammars presented are more accurate than previous approaches based on discontinuous grammar formalisms and early instances of discriminative models, but inferior to recent discriminative parsers.

References

Showing 1–10 of 44 references
Discontinuity Revisited: An Improved Conversion to Context-Free Representations
A labeled dependency evaluation shows that the new conversion method leads to better results by preserving local relationships and introducing fewer inconsistencies into the training data.

Language-Independent Parsing with Empty Elements
We present a simple, language-independent method for integrating recovery of empty elements into syntactic parsing. This method outperforms the best published method we are aware of on English and a …

An Improved Oracle for Dependency Parsing with Online Reordering
An improved training strategy for dependency parsers that use online reordering to handle non-projective trees improves both efficiency and accuracy, reducing the number of swap operations performed on non-projective trees by up to 80%.

Parsing as Reduction
This work reduces phrase-structure parsing to dependency parsing and shows that any off-the-shelf, trainable dependency parser can be used to produce constituents, and can perform discontinuous parsing in a very natural manner.

Antecedent Recovery: Experiments with a Trace Tagger
This paper develops both a two-step approach, which combines a trace tagger with a state-of-the-art lexicalized parser, and a one-step approach, which finds nonlocal dependencies while parsing.

Shift-Reduce CCG Parsing
It is shown that the shift-reduce parser gives accuracies competitive with C&C; given the high ambiguity levels in an automatically extracted grammar and the amount of information in the CCG lexical categories which form the shift actions, this is a surprising result.

Efficient parsing with Linear Context-Free Rewriting Systems
This work shows that parsing long sentences with such an optimally binarized grammar remains infeasible, and introduces a technique which removes this length restriction while maintaining respectable accuracy.

Non-Projective Dependency Parsing in Expected Linear Time
A novel transition system for dependency parsing, which constructs arcs only between adjacent words but can parse arbitrary non-projective trees by swapping the order of words in the input, shows state-of-the-art accuracy.

Discontinuous Parsing with an Efficient and Accurate DOP Model
We present a discontinuous variant of tree-substitution grammar (TSG) based on Linear Context-Free Rewriting Systems. We use this formalism to instantiate a Data-Oriented Parsing model applied to …

A Classifier-Based Parser with Linear Run-Time Complexity
It is shown that, with an appropriate feature set used in classification, a very simple one-path greedy parser can perform at the same level of accuracy as more complex parsers.