Dependency Grammar Induction with a Neural Variational Transition-based Parser

@inproceedings{Li2019DependencyGI,
  title={Dependency Grammar Induction with a Neural Variational Transition-based Parser},
  author={Bowen Li and Jianpeng Cheng and Yang Liu and Frank Keller},
  booktitle={AAAI},
  year={2019}
}
Dependency grammar induction is the task of learning dependency syntax without annotated training data. [...] We train the parser with an integration of variational inference, posterior regularization, and variance reduction techniques. The resulting framework outperforms previous unsupervised transition-based dependency parsers and achieves performance comparable to graph-based models, both on the English Penn Treebank and on the Universal Dependency Treebank. In an empirical comparison, we show that…
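The abstract's key ingredients, variational inference over latent transition sequences plus variance reduction, can be illustrated with a score-function (REINFORCE) gradient estimator that uses a moving-average baseline as a control variate. This is a minimal, self-contained sketch under assumed names (`reward_fn` stands in for the generative model's log-joint), not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_actions(logits):
    """Sample one transition sequence from the parser's policy q (one categorical per step)."""
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    actions = [int(rng.choice(len(p), p=p)) for p in probs]
    log_q = sum(np.log(probs[t, a]) for t, a in enumerate(actions))
    return actions, log_q

def reinforce_step(logits, reward_fn, baseline, decay=0.9):
    """Score-function gradient: grad ≈ (R - b) * grad log q(actions).
    Subtracting the moving-average baseline b reduces the estimator's variance."""
    actions, log_q = sample_actions(logits)
    reward = reward_fn(actions)                       # stand-in for log p(x, tree)
    baseline = decay * baseline + (1 - decay) * reward
    advantage = reward - baseline                     # scales grad log q in a real system
    return advantage, log_q, baseline

# Toy run: 5 transition steps, 3 candidate actions each, dummy reward.
logits, baseline = rng.normal(size=(5, 3)), 0.0
for _ in range(3):
    adv, log_q, baseline = reinforce_step(logits, lambda a: -float(sum(a)), baseline)
print(round(float(adv), 3))
```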
Citations

Second-Order Unsupervised Neural Dependency Parsing
This work proposes a second-order extension of unsupervised neural dependency models that incorporates grandparent-child or sibling information, together with a novel design of the neural parameterization and optimization methods for these models.
Unsupervised Recurrent Neural Network Grammars
An inference network parameterized as a neural CRF constituency parser is developed to maximize the evidence lower bound, applying amortized variational inference to the unsupervised learning of RNNGs.
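The amortized variational objective in question is the standard evidence lower bound; below is a minimal Monte Carlo sketch of it, with the inference network and generative model abstracted into stand-in callables:

```python
import numpy as np

rng = np.random.default_rng(1)

def elbo(x, sample_q, log_q, log_joint, n_samples=8):
    """Monte Carlo ELBO: E_q[log p(x, z) - log q(z | x)] estimated by sampling."""
    draws = [sample_q(x) for _ in range(n_samples)]
    return sum(log_joint(x, z) - log_q(z, x) for z in draws) / n_samples

# Toy check with a binary latent: q uniform, joint constant.
sample_q = lambda x: int(rng.integers(2))
log_q = lambda z, x: np.log(0.5)
log_joint = lambda x, z: np.log(0.25)
print(round(elbo(None, sample_q, log_q, log_joint), 3))  # = log(0.5) ≈ -0.693
```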
A Regularization-based Framework for Bilingual Grammar Induction
This work proposes a framework in which the learning of one language's grammar model is influenced by knowledge from another language's model, along with three regularization methods that encourage similarity between model parameters, dependency edge scores, and parse trees.
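Of the three regularizers mentioned, the parameter-similarity one is the simplest to write down. A hedged sketch with illustrative names, assuming flat parameter vectors for the two languages:

```python
import numpy as np

def bilingual_objective(loss_a, loss_b, theta_a, theta_b, lam=0.1):
    """Joint objective for two languages: each language's own loss plus an
    L2 coupling term that encourages the two parameter vectors to stay close."""
    return loss_a + loss_b + lam * float(np.sum((theta_a - theta_b) ** 2))

print(bilingual_objective(1.0, 2.0, np.zeros(4), np.ones(4)))  # 3.0 + 0.1 * 4 = 3.4
```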
Enhancing Unsupervised Generative Dependency Parser with Contextual Information
This paper proposes a novel probabilistic model, the discriminative neural dependency model with valence (D-NDMV), which generates a sentence and its parse from a continuous latent representation that encodes global contextual information about the generated sentence.
Target Language-Aware Constrained Inference for Cross-lingual Dependency Parsing
It is shown that weak supervision from linguistic knowledge about the target language can substantially improve a cross-lingual graph-based dependency parser; the paper proposes new algorithms that adapt two techniques, Lagrangian relaxation and posterior regularization, to perform inference under corpus-statistics constraints.
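Posterior regularization, one of the two techniques named above, projects a model posterior onto a constraint set in KL divergence; for a categorical posterior and a single expectation constraint this reduces to one-dimensional dual ascent. A toy sketch (the parser-side machinery is omitted, and the names are illustrative):

```python
import numpy as np

def pr_project(p, f, b, steps=200, lr=0.5):
    """Project posterior p onto {q : E_q[f] <= b} in KL divergence via dual
    (projected) gradient ascent. p: categorical posterior over latent structures;
    f: constraint feature value of each structure; b: the bound."""
    lam = 0.0
    for _ in range(steps):
        q = p * np.exp(-lam * f)
        q /= q.sum()
        grad = q @ f - b                 # dual gradient: expected feature minus bound
        lam = max(0.0, lam + lr * grad)  # lambda >= 0 for an inequality constraint
    return q, lam

# Toy: 3 structures, constrain the expected feature down from 0.8 to at most 0.5.
p = np.array([0.2, 0.3, 0.5])
f = np.array([0.0, 1.0, 1.0])
q, lam = pr_project(p, f, b=0.5)
print(q.round(3), round(float(q @ f), 3))
```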
Semi-Supervised Dependency Parsing with Arc-Factored Variational Autoencoding
A model based on the variational autoencoder framework is proposed that makes learning fully arc-factored, thereby circumventing the challenges brought by the tree constraint.
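"Arc-factored" means a tree's score decomposes into a sum of independent arc scores, which is what makes learning tractable. A minimal illustration (indexing convention assumed: position 0 is the root):

```python
import numpy as np

def tree_score(arc_scores, heads):
    """Arc-factored scoring: a tree's score is the sum of its arc scores.
    arc_scores[h, d] scores the arc head h -> dependent d; heads[d-1] is word d's head."""
    return sum(arc_scores[h, d] for d, h in enumerate(heads, start=1))

# Toy: 3 words plus a root at index 0.
scores = np.random.default_rng(2).normal(size=(4, 4))
print(round(float(tree_score(scores, heads=[0, 1, 1])), 3))  # root->1, 1->2, 1->3
```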
A Survey of Unsupervised Dependency Parsing
Syntactic dependency parsing is an important task in natural language processing. Unsupervised dependency parsing aims to learn a dependency parser from sentences that have no annotation of their correct parse trees.
Recursive Tree Grammar Autoencoders
This work proposes a novel autoencoder approach, the recursive tree grammar autoencoder (RTG-AE), which encodes trees via a bottom-up parser and decodes trees via a tree grammar, both controlled by neural networks that minimize the variational autoencoder loss.
Guiding Symbolic Natural Language Grammar Induction via Transformer-Based Sequence Probabilities
A novel approach to the automated learning of syntactic rules governing natural languages is proposed, based on using the probabilities that transformer language models assign to sentences to guide symbolic learning processes such as clustering and rule induction.
Latent Template Induction with Gumbel-CRFs
This work proposes the Gumbel-CRF, a continuous relaxation of the CRF sampling algorithm based on a relaxed forward-filtering backward-sampling (FFBS) procedure. The relaxation gives more stable gradients than score-function estimators, and the model learns interpretable templates during training, which allows the decoder to be controlled at test time.
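As a rough illustration of the relaxed FFBS idea (the paper's actual estimator differs in its details), here is a linear-chain version in which each hard categorical draw is replaced by a Gumbel-softmax sample:

```python
import numpy as np

rng = np.random.default_rng(3)

def gumbel_softmax(logits, tau=0.5):
    """Relaxed one-hot sample: a soft, differentiable stand-in for a categorical draw."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = np.exp((logits + g) / tau - ((logits + g) / tau).max())
    return y / y.sum()

def relaxed_ffbs(emit, trans, tau=0.5):
    """Forward-filter, backward-sample a label sequence from a linear-chain CRF,
    with Gumbel-softmax in place of hard draws. emit: (T, K) emission scores;
    trans: (K, K) transition scores, both in log-space."""
    T, K = emit.shape
    alpha = np.zeros((T, K))
    alpha[0] = emit[0]
    for t in range(1, T):                        # forward filtering (log-sum-exp)
        alpha[t] = emit[t] + np.logaddexp.reduce(alpha[t - 1][:, None] + trans, axis=0)
    ys = [gumbel_softmax(alpha[-1], tau)]        # backward sampling, relaxed
    for t in range(T - 2, -1, -1):
        # condition on the soft sample at t+1 by mixing the transition columns
        ys.append(gumbel_softmax(alpha[t] + trans @ ys[-1], tau))
    return np.stack(ys[::-1])                    # (T, K) soft label sequence

print(relaxed_ffbs(rng.normal(size=(4, 3)), rng.normal(size=(3, 3))).round(2))
```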

References

Showing 1-10 of 37 references.
Fast Unsupervised Dependency Parsing with Arc-Standard Transitions
This paper shows that applying a supervised incremental parsing model to unsupervised parsing yields a parser with linear time complexity, making it faster than other methods.
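The arc-standard system referred to here uses three transitions, each applied in constant time, which is where the linear overall complexity comes from. A minimal sketch with a stand-in oracle:

```python
def arc_standard(sentence, oracle):
    """Parse with arc-standard transitions: SHIFT, LEFT-ARC, RIGHT-ARC.
    oracle(stack, buffer) names the next transition; returns each word's head."""
    stack, buffer, heads = [], list(range(len(sentence))), {}
    while buffer or len(stack) > 1:
        action = oracle(stack, buffer)
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        elif action == "LEFT-ARC":        # second-top becomes a dependent of top
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        else:                             # RIGHT-ARC: top becomes a dependent of second-top
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads                          # each transition is O(1): linear-time parsing

# Stand-in oracle: shift everything, then attach right-to-left.
toy_oracle = lambda stack, buffer: "SHIFT" if buffer else "RIGHT-ARC"
print(arc_standard(["the", "dog", "barks"], toy_oracle))  # {2: 1, 1: 0}
```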
Unsupervised Neural Dependency Parsing
A novel approach to unsupervised dependency parsing that uses a neural model to predict grammar rule probabilities from distributed representations of POS tags; it outperforms previous approaches that exploit POS correlations and is competitive with recent state-of-the-art approaches on nine languages.
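The core move in this line of work is replacing count-table rule probabilities (as in the classic DMV) with the output of a small network over distributed POS representations. An illustrative sketch; the dimensions and parameterization here are made up:

```python
import numpy as np

rng = np.random.default_rng(4)

def rule_probs(head_tag, emb, W, direction):
    """Predict p(child-tag | head-tag, direction) from a POS tag embedding,
    instead of reading it from a count table."""
    h = np.tanh(emb[head_tag])              # tag embedding through a nonlinearity
    logits = W[direction] @ h               # one output weight matrix per direction
    e = np.exp(logits - logits.max())
    return e / e.sum()

n_tags, dim = 10, 16
emb = rng.normal(size=(n_tags, dim))
W = {"left": rng.normal(size=(n_tags, dim)), "right": rng.normal(size=(n_tags, dim))}
print(round(float(rule_probs(3, emb, W, "right").sum()), 6))  # a proper distribution: 1.0
```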
CRF Autoencoder for Unsupervised Dependency Parsing
An unsupervised dependency parsing model based on the CRF autoencoder; the model is discriminative and globally normalized, which allows rich features as well as universal linguistic priors, and it is evaluated on eight multilingual treebanks.
Fast(er) Exact Decoding and Global Training for Transition-Based Dependency Parsing via a Minimal Feature Set
A minimal feature set for transition-based dependency parsing is presented, achieving the best unlabeled attachment score reported on the Chinese Treebank and a "second-best-in-class" result on the English Penn Treebank.
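"Minimal feature set" means the transition classifier looks at only a few positions of the parser configuration. A sketch of one such set; the exact positions vary by transition system:

```python
def minimal_features(stack, buffer, words):
    """Extract a tiny feature set from a parser configuration: the top two
    stack items (s0, s1) and the front of the buffer (b0), padded if absent."""
    s0 = words[stack[-1]] if len(stack) >= 1 else "<pad>"
    s1 = words[stack[-2]] if len(stack) >= 2 else "<pad>"
    b0 = words[buffer[0]] if buffer else "<pad>"
    return (s0, s1, b0)

print(minimal_features([0, 1], [2], ["the", "dog", "barks"]))  # ('dog', 'the', 'barks')
```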
Generative Incremental Dependency Parsing with Neural Networks
A neural network model for scalable generative transition-based dependency parsing that surpasses the accuracy and speed of previous generative dependency parsers and shows a strong improvement over n-gram language models, opening the way to the efficient integration of syntax into neural models for language generation.
A convex and feature-rich discriminative approach to dependency grammar induction
This paper proposes a new convex formulation of the dependency grammar induction task, optimized with the Frank-Wolfe algorithm; the formulation is discriminative, allowing the use of different kinds of features.
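Frank-Wolfe is attractive for problems like this because it needs only a linear minimization oracle over the feasible set rather than projections onto it. A generic sketch on a toy problem (the parse-polytope oracle of the actual paper is replaced by a simplex oracle):

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, iters=100):
    """Generic Frank-Wolfe: minimize the linearized objective over the feasible
    set (the linear minimization oracle), then step toward that vertex."""
    x = x0
    for t in range(iters):
        s = lmo(grad(x))                  # best vertex for the current gradient
        gamma = 2.0 / (t + 2.0)           # standard diminishing step size
        x = (1 - gamma) * x + gamma * s
    return x

# Toy: minimize ||x - c||^2 over the probability simplex.
c = np.array([0.2, 0.5, 0.3])
grad = lambda x: 2 * (x - c)
lmo = lambda g: np.eye(3)[np.argmin(g)]   # simplex LMO picks the best coordinate
print(frank_wolfe(grad, lmo, np.array([1.0, 0.0, 0.0])).round(3))  # ≈ c
```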
Global Transition-based Non-projective Dependency Parsing
A transition-based interpretation of the MH₄ algorithm, an O(n⁴) mildly non-projective dynamic-programming parser with very high coverage on non-projective treebanks, is introduced, in which parser items are mapped to sequences of transitions.
Improving Unsupervised Dependency Parsing with Richer Contexts and Smoothing
This paper introduces basic valence frames and lexical information into an unsupervised dependency grammar inducer and shows how this additional information can be leveraged via smoothing to produce state-of-the-art results.
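Smoothing here means mixing sparse, richly conditioned distributions with coarser back-offs so that rare valence or lexical contexts keep probability mass. A minimal linear-interpolation illustration; the paper's actual scheme ties its parameters differently:

```python
import numpy as np

def smoothed(p_lexical, p_backoff, beta=0.7):
    """Linear-interpolation smoothing: mix a sparse lexicalized distribution
    with a coarser unlexicalized back-off distribution."""
    return beta * np.asarray(p_lexical) + (1 - beta) * np.asarray(p_backoff)

print(smoothed([0.0, 1.0, 0.0], [0.3, 0.4, 0.3]))  # no event is left at zero
```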
Guiding Unsupervised Grammar Induction Using Contrastive Estimation
It is shown that, using the same features, log-linear dependency grammar models trained with contrastive estimation (CE) can drastically outperform EM-trained generative models on the task of matching human linguistic annotations (the MATCHLINGUIST task).
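Contrastive estimation replaces the intractable global normalizer of a log-linear model with a sum over a small neighborhood of perturbed sentences, such as adjacent-word transpositions. A minimal sketch with a stand-in scoring function:

```python
import numpy as np

def ce_logprob(score, x, neighborhood):
    """Contrastive estimation: normalize a log-linear score over a neighborhood
    of x rather than over all possible sentences."""
    scores = np.array([score(n) for n in neighborhood(x)])
    return score(x) - np.logaddexp.reduce(scores)

def transpositions(x):
    """Neighborhood: the sentence itself plus every adjacent-word swap."""
    out = [tuple(x)]
    for i in range(len(x) - 1):
        y = list(x)
        y[i], y[i + 1] = y[i + 1], y[i]
        out.append(tuple(y))
    return out

score = lambda s: sum(1.0 for a, b in zip(s, s[1:]) if a < b)  # toy: reward ordered bigrams
print(round(float(ce_logprob(score, ("a", "b", "c"), transpositions)), 3))
```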
Using Left-corner Parsing to Encode Universal Structural Constraints in Grammar Induction
A method is described that incorporates center-embedding into grammar induction by restricting the model's search space to trees with limited center-embedding; it is competitive with the current state-of-the-art model in a number of languages.