Learning Accurate, Compact, and Interpretable Tree Annotation

@inproceedings{Petrov2006LearningAC,
  title={Learning Accurate, Compact, and Interpretable Tree Annotation},
  author={Slav Petrov and Leon Barrett and Romain Thibaux and Dan Klein},
  booktitle={ACL},
  year={2006}
}
We present an automatic approach to tree annotation in which basic nonterminal symbols are alternately split and merged to maximize the likelihood of a training treebank. Starting with a simple X-bar grammar, we learn a new grammar whose nonterminals are subsymbols of the original nonterminals. In contrast with previous work, we are able to split various terminals to different degrees, as appropriate to the actual complexity in the data. Our grammars automatically learn the kinds of linguistic…
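
The core of the approach is a split/merge cycle: starting from a coarse X-bar grammar, every nonterminal is split into subsymbols, EM reestimates rule probabilities over the latent subsymbol assignments of the observed treebank trees, and splits that contribute little to the likelihood are merged back. The sketch below illustrates one split/EM round on a toy treebank; the treebank, data layout, and function names are assumptions made for illustration rather than the authors' implementation, and the merge step is omitted.

# A minimal, self-contained sketch of the split/EM cycle behind latent-annotation PCFGs,
# in the spirit of the approach summarized above. Trees are observed; only each node's
# subsymbol is latent, so EM runs inside-outside within each fixed tree. The toy treebank
# and helper names are illustrative assumptions; the merge step is omitted for brevity.
import math, random
from collections import defaultdict

# A tree is ("Label", word) at preterminals or ("Label", left, right) internally.
TREEBANK = [("S", ("NP", "she"), ("VP", ("V", "saw"), ("NP", "stars"))),
            ("S", ("NP", "he"), ("VP", ("V", "heard"), ("NP", "stars")))]

def init_grammar(treebank):
    """Read off a one-subsymbol grammar with relative-frequency rule probabilities."""
    bcount, lcount, total = defaultdict(float), defaultdict(float), defaultdict(float)
    def visit(n):
        total[n[0]] += 1.0
        if isinstance(n[1], str):
            lcount[(n[0], n[1])] += 1.0
        else:
            bcount[(n[0], n[1][0], n[2][0])] += 1.0
            visit(n[1]); visit(n[2])
    for t in treebank:
        visit(t)
    return ({A: 1 for A in total},                                   # subsymbols per symbol
            {r: [[[c / total[r[0]]]]] for r, c in bcount.items()},   # P(A_i -> B_j C_k)
            {r: [c / total[r[0]]] for r, c in lcount.items()})       # P(A_i -> word)

def inside(n, g):
    """Bottom-up subsymbol scores; returns (scores, left_result, right_result)."""
    sizes, bi, lex = g
    if isinstance(n[1], str):
        return [lex[(n[0], n[1])][i] for i in range(sizes[n[0]])], None, None
    L, R = inside(n[1], g), inside(n[2], g)
    p = bi[(n[0], n[1][0], n[2][0])]
    return [sum(p[i][j][k] * L[0][j] * R[0][k] for j in range(len(L[0]))
                for k in range(len(R[0]))) for i in range(sizes[n[0]])], L, R

def accumulate(n, ins, out, g, bc, lc, z):
    """Top-down pass adding expected rule counts (posterior mass divided by z)."""
    sizes, bi, lex = g
    if isinstance(n[1], str):
        for i, o in enumerate(out):
            lc[(n[0], n[1], i)] += o * ins[0][i] / z
        return
    p, scL, scR = bi[(n[0], n[1][0], n[2][0])], ins[1][0], ins[2][0]
    outL = [sum(out[i] * p[i][j][k] * scR[k] for i in range(len(out))
                for k in range(len(scR))) for j in range(len(scL))]
    outR = [sum(out[i] * p[i][j][k] * scL[j] for i in range(len(out))
                for j in range(len(scL))) for k in range(len(scR))]
    for i in range(len(out)):
        for j in range(len(scL)):
            for k in range(len(scR)):
                bc[(n[0], n[1][0], n[2][0], i, j, k)] += out[i] * p[i][j][k] * scL[j] * scR[k] / z
    accumulate(n[1], ins[1], outL, g, bc, lc, z)
    accumulate(n[2], ins[2], outR, g, bc, lc, z)

def mstep(sizes, bc, lc):
    """Renormalize expected counts per (symbol, subsymbol) into new probabilities."""
    denom = defaultdict(float)
    for (A, B, C, i, j, k), c in bc.items():
        denom[(A, i)] += c
    for (A, w, i), c in lc.items():
        denom[(A, i)] += c
    bi, lex = {}, {}
    for (A, B, C, i, j, k), c in bc.items():
        t = bi.setdefault((A, B, C), [[[0.0] * sizes[C] for _ in range(sizes[B])] for _ in range(sizes[A])])
        t[i][j][k] = c / denom[(A, i)]
    for (A, w, i), c in lc.items():
        lex.setdefault((A, w), [0.0] * sizes[A])[i] = c / denom[(A, i)]
    return bi, lex

def em(treebank, g, iters=25):
    """EM over the latent subsymbol assignments of every observed tree."""
    sizes, bi, lex = g
    for _ in range(iters):
        bc, lc, ll = defaultdict(float), defaultdict(float), 0.0
        for t in treebank:
            ins = inside(t, (sizes, bi, lex))
            prior = [1.0 / sizes[t[0]]] * sizes[t[0]]  # uniform root-subsymbol prior
            z = sum(p * s for p, s in zip(prior, ins[0]))  # likelihood of this tree
            ll += math.log(z)
            accumulate(t, ins, prior, (sizes, bi, lex), bc, lc, z)
        bi, lex = mstep(sizes, bc, lc)
    return (sizes, bi, lex), ll

def split(g, noise=0.02):
    """Split every subsymbol in two, copying probabilities with symmetry-breaking noise."""
    sizes, bi, lex = g
    ns = {A: 2 * k for A, k in sizes.items()}
    nbi = {(A, B, C): [[[p[i // 2][j // 2][k // 2] / 4.0 * (1 + random.uniform(-noise, noise))
                         for k in range(ns[C])] for j in range(ns[B])] for i in range(ns[A])]
           for (A, B, C), p in bi.items()}
    nlex = {(A, w): [p[i // 2] * (1 + random.uniform(-noise, noise)) for i in range(ns[A])]
            for (A, w), p in lex.items()}
    return ns, nbi, nlex

# One split/EM round: refine the X-bar-style grammar, then fit the subsymbols.
grammar, ll = em(TREEBANK, split(init_grammar(TREEBANK)))
print("training log-likelihood after one split/EM round:", round(ll, 3))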