Transition-Based Dependency Parsing with Stack Long Short-Term Memory

We propose a technique for learning representations of parser states in transition-based dependency parsers. Our primary innovation is a new control structure for sequence-to-sequence neural networks—the stack LSTM. Like the conventional stack data structures used in transition-based parsing, elements can be pushed to or popped from the top of the stack in constant time, but, in addition, the LSTM maintains a continuous-space embedding of the stack's contents.
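
To make the idea concrete, here is a minimal sketch of a stack LSTM in Python/PyTorch. It is an illustration under our own assumptions, not the authors' implementation: the class name StackLSTM and the methods push, pop, and summary are hypothetical. A standard LSTM cell is driven by a stack of recurrent states, so push extends the encoding with a new element in constant time, and pop reverts to the state that summarized the remaining stack contents.

```python
import torch
import torch.nn as nn

class StackLSTM(nn.Module):
    """Illustrative stack LSTM: an LSTM cell whose states are kept on a stack."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        # Sentinel state representing the empty stack.
        empty = (torch.zeros(1, hidden_size), torch.zeros(1, hidden_size))
        self.states = [empty]

    def push(self, x: torch.Tensor) -> None:
        # Feed the new element through the LSTM cell, conditioned on the
        # current top state, and stack the resulting state (constant time).
        h, c = self.cell(x.unsqueeze(0), self.states[-1])
        self.states.append((h, c))

    def pop(self) -> None:
        # Discard the top state; the summary reverts to the embedding of
        # the remaining stack contents (constant time).
        if len(self.states) > 1:
            self.states.pop()

    def summary(self) -> torch.Tensor:
        # Hidden state of the top element: a fixed-size embedding of the
        # current stack contents.
        return self.states[-1][0].squeeze(0)
```

In a parser, one would push embeddings of words or partially built tree fragments as they are shifted or composed, pop them as they are reduced, and read summary() at each step as part of the parser-state representation.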

Statistics

[Chart: Citations per Year, 2015-2018]

Semantic Scholar estimates that this publication has 416 citations, based on the available data.
