Dependency Parsing as Head Selection

@inproceedings{Lapata2017DependencyPA,
  title={Dependency Parsing as Head Selection},
  author={Xingxing Zhang and Jianpeng Cheng and Mirella Lapata},
  booktitle={EACL},
  year={2017}
}
Conventional graph-based dependency parsers guarantee a tree structure both during training and inference. Instead, we formalize dependency parsing as the problem of independently selecting the head of each word in a sentence. Our model, which we call DeNSe (shorthand for Dependency Neural Selection), produces a distribution over possible heads for each word using features obtained from a bidirectional recurrent neural network. Without enforcing structural constraints during training, DeNSe…
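In essence, the model scores every (dependent, candidate-head) pair from the bidirectional RNN states and applies an independent softmax over candidates for each word. Below is a minimal sketch of that idea in PyTorch; the class name, layer sizes, and the MLP pair scorer are illustrative assumptions rather than the paper's exact architecture.

```python
import torch
import torch.nn as nn

class HeadSelector(nn.Module):
    """Illustrative head-selection scorer (names and sizes are assumptions)."""
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.birnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                             bidirectional=True)
        # Scores one (dependent, candidate-head) pair from concatenated states.
        self.scorer = nn.Sequential(
            nn.Linear(4 * hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, tokens):
        # tokens: (batch, seq_len) word ids; position 0 is an artificial ROOT.
        h, _ = self.birnn(self.embed(tokens))      # (B, T, 2*hidden_dim)
        B, T, H = h.shape
        dep = h.unsqueeze(2).expand(B, T, T, H)    # dependent states
        head = h.unsqueeze(1).expand(B, T, T, H)   # candidate-head states
        scores = self.scorer(torch.cat([dep, head], -1)).squeeze(-1)  # (B, T, T)
        # A word cannot be its own head: mask the diagonal.
        diag = torch.eye(T, dtype=torch.bool, device=scores.device)
        scores = scores.masked_fill(diag, float("-inf"))
        # Independent softmax per word: row i is P(head of word i = j).
        return scores.log_softmax(dim=-1)
```

Training would minimize the negative log-likelihood of each word's gold head; since nothing forces the per-word argmaxes to form a tree, the paper repairs the (rare) non-tree outputs at inference with a maximum spanning tree algorithm.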
A neural parser as a direct classifier for head-final languages
TLDR
The neural parser performed well both on conventional Japanese corpora and the Japanese version of the Universal Dependencies corpus, and the advantages of distributed representations were evident in comparison with a conventional non-neural model.
Transition-based Semantic Dependency Parsing with Pointer Networks
TLDR
A transition system that, thanks to Pointer Networks, can straightforwardly produce labelled directed acyclic graphs and perform semantic dependency parsing, matching the best fully-supervised accuracy to date on the SemEval 2015 Task 18 datasets among previous state-of-the-art graph-based parsers.
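Several entries in this list build on Pointer Networks, where a decoder "points" at an input position via attention. As a generic illustration only (not this paper's specific transition system), here is a minimal sketch of the pointing step; all names and dimensions are assumptions.

```python
import torch
import torch.nn as nn

class PointerStep(nn.Module):
    """Additive-attention pointer: the attention distribution IS the output."""
    def __init__(self, enc_dim, dec_dim, att_dim=128):
        super().__init__()
        self.w_enc = nn.Linear(enc_dim, att_dim, bias=False)
        self.w_dec = nn.Linear(dec_dim, att_dim, bias=False)
        self.v = nn.Linear(att_dim, 1, bias=False)

    def forward(self, enc_states, dec_state):
        # enc_states: (B, T, enc_dim); dec_state: (B, dec_dim)
        e = torch.tanh(self.w_enc(enc_states) + self.w_dec(dec_state).unsqueeze(1))
        logits = self.v(e).squeeze(-1)      # (B, T): one score per input position
        return logits.log_softmax(dim=-1)   # log-distribution over positions
```

For dependency parsing, the highest-scoring position is taken as the head (or arc target) of the word currently being processed.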
Dependency parsing with structure preserving embeddings
TLDR
This work seeks to learn interpretable representations by training a parser to explicitly preserve structural properties of a tree, incorporating geometric properties of dependency trees as training losses within a graph-based parser.
Read, Tag, and Parse All at Once, or Fully-neural Dependency Parsing
TLDR
A dependency parser implemented as a single deep neural network that reads orthographic representations of words and directly generates dependencies and their labels, reaching state-of-the-art performance on Slavic languages from the Universal Dependencies treebank.
Multitask Pointer Network for Multi-Representational Parsing
TLDR
A Pointer Network architecture with two separate task-specific decoders and a common encoder is developed, together with a multitask learning strategy to jointly train them, resulting in the first parser that can jointly produce both unrestricted constituent and dependency trees from a single model.
Extracting Headless MWEs from Dependency Parse Trees: Parsing, Tagging, and Joint Modeling Approaches
TLDR
Experimental results show that tagging is more accurate than parsing for identifying flat-structure MWEs, and that a joint decoder reconciling the two different views leads, for non-BERT features, to higher accuracies.
Dependency Parsing with Bottom-up Hierarchical Pointer Networks
TLDR
A bottom-up-oriented Hierarchical Pointer Network for the left-to-right parser is developed, and two novel transition-based alternatives are proposed: an approach that parses a sentence in right-to-left order and a variant that does it from the outside in.
Bidirectional Transition-Based Dependency Parsing
TLDR
Empirical results show that the proposed simple framework for bidirectional transition-based parsing leads to competitive parsing accuracy, and that the method based on a dynamic oracle consistently achieves the best performance.
Multitask Pointer Network for Korean Dependency Parsing
TLDR
A novel dependency-parsing framework called head-pointing-based dependency parsing is introduced that achieves state-of-the-art performance for Korean dependency parsing and does not require any handcrafted features or language-specific rules.
Viable Dependency Parsing as Sequence Labeling
TLDR
This work recasts dependency parsing as a sequence labeling problem, exploring several encodings of dependency trees as labels and showing that, with a conventional BiLSTM-based model, it is possible to obtain fast and accurate parsers.
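To make the sequence-labeling recast concrete, one naive encoding tags each token with the signed offset to its head plus its relation; the function names and label format below are purely illustrative assumptions, not the paper's exact encodings.

```python
def encode(heads, rels):
    """heads[i] is the 1-based head of word i+1 (0 = artificial root)."""
    labels = []
    for i, (h, r) in enumerate(zip(heads, rels), start=1):
        offset = 0 if h == 0 else h - i   # signed distance to the head
        labels.append(f"{offset}@{r}")
    return labels

def decode(labels):
    """Invert encode(): recover (heads, rels) from per-token labels."""
    heads, rels = [], []
    for i, lab in enumerate(labels, start=1):
        off, r = lab.split("@")
        heads.append(0 if int(off) == 0 else i + int(off))
        rels.append(r)
    return heads, rels

# "She reads books": She -> reads, reads -> root, books -> reads
print(encode([2, 0, 2], ["nsubj", "root", "obj"]))
# ['1@nsubj', '0@root', '-1@obj']
```

With such an encoding, any off-the-shelf sequence tagger (e.g., a BiLSTM) can output dependency trees, though raw offset labels are sparse; the work explores several encodings, some designed to mitigate exactly that.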

References

Showing 1–10 of 80 references
Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations
TLDR
The effectiveness of the BiLSTM approach is demonstrated by applying it to a greedy transition-based parser as well as to a globally optimized graph-based parser.
An Effective Neural Network Model for Graph-based Dependency Parsing
TLDR
This paper proposes a general and effective neural network model for graph-based dependency parsing that can automatically learn high-order feature combinations using only atomic features, by exploiting a novel tanh-cube activation function.
Transition-based Dependency Parsing with Rich Non-local Features
TLDR
This paper shows that the accuracy of transition-based dependency parsers can be improved by considering even richer feature sets than those employed in previous systems, improving accuracy in the standard Penn Treebank setup and rivaling the best results overall.
Dependency Parsing
Dependency parsing has been a prime focus of NLP research of late due to its ability to help parse languages with a free word order. Dependency parsing has been shown to improve NLP systems in…
Incremental Parsing with Minimal Features Using Bi-Directional LSTM
TLDR
This work uses bi-directional LSTM sentence representations to model a parser state with only three sentence positions, which automatically identifies important aspects of the entire sentence, and achieves state-of-the-art results among greedy dependency parsers for English.
Experiments with a Higher-Order Projective Dependency Parser
TLDR
In the multilingual exercise of the CoNLL-2007 shared task (Nivre et al., 2007), the system obtains the best accuracy for English and the second-best accuracies for Basque and Czech.
Three New Probabilistic Models for Dependency Parsing: An Exploration
TLDR
Preliminary empirical results from evaluating the three models' parsing performance on annotated Wall Street Journal training text (derived from the Penn Treebank) suggest the generative model performs significantly better than the others, and does about equally well at assigning part-of-speech tags.
Structured Training for Neural Network Transition-Based Parsing
TLDR
This work presents structured perceptron training for neural network transition-based dependency parsing, and provides in-depth ablative analysis to determine which aspects of this model provide the largest gains in accuracy.
A Tale of Two Parsers: Investigating and Combining Graph-based and Transition-based Dependency Parsing
TLDR
A beam-search-based parser that combines both graph-based and transition-based parsing into a single system for training and decoding is proposed, showing that it outperforms both the pure graph-based and the pure transition-based parsers.
A Fast and Accurate Dependency Parser using Neural Networks
TLDR
This work proposes a novel way of learning a neural network classifier for use in a greedy, transition-based dependency parser that can work very fast, while achieving about a 2% improvement in unlabeled and labeled attachment scores on both English and Chinese datasets.
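As background for this greedy, transition-based design: at each step a classifier chooses one of a small set of actions, typically in an arc-standard system. Below is a minimal, hypothetical sketch of that loop; the classify stub stands in for the paper's feed-forward network over embedded stack/buffer features, and all names are illustrative.

```python
def parse(n_words, classify):
    """Unlabeled arc-standard parsing driven by a per-step classifier."""
    stack, buffer = [0], list(range(1, n_words + 1))  # 0 is the ROOT token
    arcs = []                                         # (head, dependent) pairs
    while buffer or len(stack) > 1:
        action = classify(stack, buffer, arcs)
        if action == "SHIFT" and buffer:
            stack.append(buffer.pop(0))               # move next word onto stack
        elif action == "LEFT-ARC" and len(stack) >= 2 and stack[-2] != 0:
            dep = stack.pop(-2)                       # second-from-top is dependent
            arcs.append((stack[-1], dep))
        elif action == "RIGHT-ARC" and len(stack) >= 2:
            dep = stack.pop()                         # top of stack is dependent
            arcs.append((stack[-1], dep))
        else:
            break                                     # illegal action: stop defensively
    return arcs
```

Each sentence of n words is parsed in 2n transitions (one shift plus one arc action per word), which is what makes greedy parsers of this kind fast.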