Globally Normalized Transition-Based Neural Networks

@article{Andor2016GloballyNT,
  title={Globally Normalized Transition-Based Neural Networks},
  author={Daniel Andor and Chris Alberti and David Weiss and Aliaksei Severyn and Alessandro Presta and Kuzman Ganchev and Slav Petrov and Michael Collins},
  journal={ArXiv},
  year={2016},
  volume={abs/1603.06042}
}
We introduce a globally normalized transition-based neural network model that achieves state-of-the-art part-of-speech tagging, dependency parsing and sentence compression results. Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models. We discuss the importance of global as opposed to local normalization: a key insight is that the label bias problem implies that globally normalized models can be strictly more expressive than locally normalized models.
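
As a rough illustration of the local-versus-global distinction the abstract draws, the sketch below contrasts a locally normalized model (a softmax over actions at every step, as in greedy transition-based parsers) with a globally normalized one (a single CRF-style partition function over complete action sequences). The transition system, scoring function, and sequence length are toy assumptions for illustration, not the paper's actual model; the paper trains with beam search and early updates rather than the brute-force enumeration used here.

```python
# Toy contrast between local and global normalization over a tiny
# transition system. All names and scores are illustrative
# assumptions, not the paper's actual model or features.
import math
from itertools import product

ACTIONS = ["SHIFT", "LEFT-ARC", "RIGHT-ARC"]
SEQ_LEN = 3  # length of a complete action sequence in this toy setting

def score(history, action):
    """Stand-in for the feed-forward network's raw (unnormalized)
    score of taking `action` after `history`; a fixed toy function."""
    return 0.5 * len(history) + ACTIONS.index(action) * 0.3

def local_log_prob(seq):
    """Locally normalized: a softmax over actions at every step, so
    each step's probabilities must sum to one (the label bias problem:
    later evidence cannot lower the mass already committed upstream)."""
    logp = 0.0
    for i, action in enumerate(seq):
        history = seq[:i]
        z = sum(math.exp(score(history, b)) for b in ACTIONS)
        logp += score(history, action) - math.log(z)
    return logp

def global_log_prob(seq):
    """Globally normalized (CRF-style): one partition function over all
    complete sequences; per-step scores are left unconstrained."""
    log_z = math.log(sum(
        math.exp(sum(score(s[:i], a) for i, a in enumerate(s)))
        for s in product(ACTIONS, repeat=SEQ_LEN)))
    return sum(score(seq[:i], a) for i, a in enumerate(seq)) - log_z

seq = ("SHIFT", "LEFT-ARC", "RIGHT-ARC")
print("local :", local_log_prob(seq))
print("global:", global_log_prob(seq))
```

Because the global model normalizes only once over whole sequences, its per-step scores need not sum to one, which is the expressiveness gap the abstract refers to; in practice the partition function is intractable to enumerate, so the paper approximates it over a beam.
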
Citations

Neural Joint Model for Transition-based Chinese Syntactic Analysis
A Globally Normalized Neural Model for Semantic Parsing
Read, Tag, and Parse All at Once, or Fully-neural Dependency Parsing
Joint POS Tagging and Dependence Parsing With Transition-Based Neural Networks
