Improved Transition-based Parsing by Modeling Characters instead of Words with LSTMs

We present extensions to a continuous-state dependency parsing method that make it applicable to morphologically rich languages. Starting with a high-performance transition-based parser that uses long short-term memory (LSTM) recurrent neural networks to learn representations of the parser state, we replace lookup-based word representations with…
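The core idea, as far as the abstract states it, is to build each word's vector by running an LSTM over its characters rather than looking the word up in an embedding table, so rare and unseen inflected forms still receive representations. The sketch below is a minimal, forward-only illustration of character-level composition (the paper itself uses bidirectional LSTMs); all class names, dimensions, and initialization choices here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class CharLSTM:
    """Illustrative character-level LSTM: composes a word vector from its
    characters, so out-of-vocabulary word forms still get embeddings.
    Names and sizes are assumptions for the sketch, not the paper's."""

    def __init__(self, alphabet, char_dim=8, hidden_dim=16, seed=0):
        rng = np.random.default_rng(seed)
        self.char_to_id = {c: i for i, c in enumerate(alphabet)}
        self.E = rng.normal(0, 0.1, (len(alphabet), char_dim))  # char embeddings
        # Weights for the four LSTM gates (input, forget, output, cell), stacked.
        self.W = rng.normal(0, 0.1, (4 * hidden_dim, char_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)
        self.h_dim = hidden_dim

    def step(self, x, h, c):
        # Standard LSTM cell update for one character embedding x.
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.h_dim
        i = sigmoid(z[:H])
        f = sigmoid(z[H:2 * H])
        o = sigmoid(z[2 * H:3 * H])
        g = np.tanh(z[3 * H:])
        c = f * c + i * g
        h = o * np.tanh(c)
        return h, c

    def embed(self, word):
        """Run the LSTM over the word's characters; the final hidden
        state serves as the word's representation."""
        h = np.zeros(self.h_dim)
        c = np.zeros(self.h_dim)
        for ch in word:
            h, c = self.step(self.E[self.char_to_id[ch]], h, c)
        return h

lstm = CharLSTM("abcdefghijklmnopqrstuvwxyz")
v_seen = lstm.embed("parse")
v_unseen = lstm.embed("parsings")  # inflected form never stored in any table
```

Note that both the base form and the inflected form get same-sized vectors from the same parameters, which is what lets the parser share statistical strength across morphological variants instead of treating each surface form as an atomic symbol.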
