Inductive Dependency Parsing Joakim Nivre (Växjö University) Dordrecht: Springer (Text, Speech, and Language Technology series, edited by Nancy Ide and Jean Véronis, volume 34), 2006, xi+216 pp; hardbound, ISBN 1-4020-4888-2, $119.00, €89.95

@article{Samuelsson2007InductiveDP,
  title={Inductive Dependency Parsing Joakim Nivre (V{\"a}xj{\"o} University) Dordrecht: Springer (Text, Speech, and Language Technology series, edited by Nancy Ide and Jean V{\'e}ronis, volume 34), 2006, xi+216 pp; hardbound, ISBN 1-4020-4888-2, \$119.00, 89.95 euros},
  author={Christer Samuelsson},
  journal={Computational Linguistics},
  year={2007},
  volume={33},
  pages={267-269}
}
Philologists assure us that it’s worth learning ancient Greek just to read Homer. For any linguist, it’s definitely worth learning French, just to read Lucien Tesnière’s Éléments de syntaxe structurale (Tesnière 1959). For any serious dependency parsing student or professional, it would have been worth learning Swedish, just to read Joakim Nivre’s Inductive Dependency Parsing if Nivre had not done the world the immense favor of writing his book in English, rather than in his native tongue of… 

References

Showing 1–10 of 27 references
Head-Driven Statistical Models for Natural Language Parsing
  • M. Collins
  • Computer Science
    Computational Linguistics
  • 2003
Three statistical models for natural language parsing are described, leading to approaches in which a parse tree is represented as the sequence of decisions corresponding to a head-centered, top-down derivation of the tree.
An Empirical Comparison of Probability Models for Dependency Grammar
The present paper describes some details of the experiments and repeats them with a larger training set of 25,000 sentences, finding that the parser of Collins (1996), when combined with a highly trained tagger, also achieves 93% when trained and tested on the same sentences.
Forgetting Exceptions is Harmful in Language Learning
It is shown that in language learning, contrary to received wisdom, keeping exceptional training instances in memory can be beneficial for generalization accuracy, and that decision-tree learning often performs worse than memory-based learning.
A Maximum-Entropy-Inspired Parser
A new parser for parsing down to Penn tree-bank style parse trees that achieves 90.1% average precision/recall for sentences of length 40 and less and 89.5% when trained and tested on the previously established sections of the Wall Street Journal treebank is presented.
Guides and Oracles for Linear-Time Parsing
  • M. Kay
  • Computer Science
    IWPT
  • 2000
The stratagem of separating read-out from chart construction can also be applied to other kinds of parser, in particular, to left-corner parsers that use early composition.
An Efficient Augmented-Context-Free Parsing Algorithm
  • M. Tomita
  • Computer Science
    Computational Linguistics
  • 1987
An efficient parsing algorithm for augmented context-free grammars is introduced, and its application to on-line natural language interfaces discussed. The algorithm is a generalized LR parsing algorithm.
A Statistical Parser for Czech
This paper considers statistical parsing of Czech, which differs radically from English in at least two respects: (1) it is a highly inflected language, and (2) it has relatively free word order.
A Memory-Based Alternative for Connectionist Shift-Reduce Parsing
It is argued that MBL is an alternative psychologically plausible model for human parsing, next to neural network models.
Building a Large Annotated Corpus of English: The Penn Treebank
As a result of this grant, the researchers have now published on CD-ROM a corpus of over 4 million words of running text annotated with part-of-speech (POS) tags, which includes a fully hand-parsed version of the classic Brown corpus.
An Efficient Algorithm for Projective Dependency Parsing
This paper presents a deterministic parsing algorithm for projective dependency grammar that has been experimentally evaluated in parsing unrestricted Swedish text, achieving an accuracy above 85% with a very simple grammar.