Publications
Neural Architectures for Named Entity Recognition
Paper presented at the 2016 Conference of the North American Chapter of the Association for Computational Linguistics, held in San Diego (CA, USA), June 12-17, 2016.
Transition-Based Dependency Parsing with Stack Long Short-Term Memory
TLDR
This work proposes stack LSTMs, recurrent networks augmented with a stack pointer that supports push and pop operations, and uses them to represent the parser state in a greedy transition-based dependency parser.
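As a rough illustration of the stack LSTM idea (an LSTM whose memory behaves like a stack: push computes a new state on top, pop moves a stack pointer back so an earlier state becomes current again), here is a minimal numpy sketch; the cell, dimensions, and random weights are illustrative stand-ins, not the authors' implementation:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class StackLSTM:
        """LSTM whose history is a stack: push() extends the state under the
        stack pointer; pop() moves the pointer back to the previous state."""
        def __init__(self, input_dim, hidden_dim, seed=0):
            rng = np.random.default_rng(seed)
            d = input_dim + hidden_dim
            # One weight matrix per gate: input, forget, output, candidate.
            self.W = {g: rng.normal(0, 0.1, (hidden_dim, d)) for g in "ifoc"}
            # states[0] is the empty-stack state; ptr indexes the current top.
            self.states = [(np.zeros(hidden_dim), np.zeros(hidden_dim))]
            self.ptr = 0

        def push(self, x):
            h_prev, c_prev = self.states[self.ptr]
            z = np.concatenate([x, h_prev])
            i, f, o = (sigmoid(self.W[g] @ z) for g in "ifo")
            c = f * c_prev + i * np.tanh(self.W["c"] @ z)
            del self.states[self.ptr + 1:]          # discard any popped suffix
            self.states.append((o * np.tanh(c), c))
            self.ptr += 1

        def pop(self):
            assert self.ptr > 0, "pop on empty stack"
            self.ptr -= 1

        def summary(self):
            return self.states[self.ptr][0]         # hidden state at the top

    s = StackLSTM(4, 8)
    s.push(np.ones(4)); s.push(np.zeros(4)); s.pop()
    print(s.summary().shape)  # (8,): as if only the first push had happened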
Recurrent Neural Network Grammars
We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling.
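Concretely, an RNNG builds the tree by executing NT(X), SHIFT (GEN, in the generative variant), and REDUCE actions against a stack. The sketch below executes a fixed action sequence to show the mechanics, leaving out the neural scoring of actions; the function name and example sentence are illustrative:

    # Build "(S (NP the dog) (VP barks))" by executing parser actions.
    def execute(actions, words):
        stack, buf = [], list(words)
        for a in actions:
            if a.startswith("NT("):          # open a nonterminal, e.g. NT(NP)
                stack.append(("open", a[3:-1]))
            elif a == "SHIFT":               # move the next word onto the stack
                stack.append(buf.pop(0))
            elif a == "REDUCE":              # close the last open nonterminal
                children = []
                while not isinstance(stack[-1], tuple):
                    children.append(stack.pop())
                label = stack.pop()[1]
                stack.append("(%s %s)" % (label, " ".join(reversed(children))))
        return stack[0]

    actions = ["NT(S)", "NT(NP)", "SHIFT", "SHIFT", "REDUCE",
               "NT(VP)", "SHIFT", "REDUCE", "REDUCE"]
    print(execute(actions, ["the", "dog", "barks"]))
    # (S (NP the dog) (VP barks))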
Many Languages, One Parser
TLDR
This work trains a single multilingual model for dependency parsing and uses it to parse sentences in several languages, enabling the parser not only to parse effectively in each language but also to generalize across languages based on linguistic universals and typological similarities.
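One common way to realize this kind of cross-lingual sharing is to feed the parser a language indicator alongside each token, so a single set of weights can condition on the language. A minimal sketch; the sizes and the learned language vector are assumptions for illustration, not the paper's exact feature set:

    import numpy as np

    rng = np.random.default_rng(0)
    LANGS = ["en", "de", "es"]
    # One learned vector per language; parser weights are shared across languages.
    lang_emb = {l: rng.normal(size=8) for l in LANGS}

    def token_input(word_vec, lang):
        # The parser sees [multilingual word embedding ; language embedding].
        return np.concatenate([word_vec, lang_emb[lang]])

    x = token_input(rng.normal(size=100), "de")
    print(x.shape)  # (108,)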
DyNet: The Dynamic Neural Network Toolkit
TLDR
DyNet is a toolkit for implementing neural network models based on dynamic declaration of network structure. It has an optimized C++ backend and a lightweight graph representation, and is designed to let users implement their models in a way that is idiomatic in their preferred programming language.
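The point of dynamic declaration is that the computation graph is rebuilt for every example, so its shape can depend on the input. A minimal sketch using DyNet's Python bindings (assuming a 2.x-era API; the tiny model itself is illustrative):

    import dynet as dy

    pc = dy.ParameterCollection()
    W = pc.add_parameters((1, 4))
    trainer = dy.SimpleSGDTrainer(pc)

    # Examples have different lengths; the graph is declared per example.
    data = [([1.0, 2.0], 3.0), ([0.5, 0.5, 1.0], 2.0)]
    for xs, y in data:
        dy.renew_cg()                       # fresh graph for this example
        h = dy.esum([dy.inputVector([v, v * v, 1.0, 0.0]) for v in xs])
        pred = dy.parameter(W) * h          # structure varied with len(xs)
        loss = dy.squared_distance(pred, dy.inputVector([y]))
        loss.value()                        # run forward before backprop
        loss.backward()
        trainer.update()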
Improved Transition-based Parsing by Modeling Characters instead of Words with LSTMs
TLDR
This work extends a continuous-state dependency parsing method to morphologically rich languages by replacing lookup-based word representations with representations constructed from the words' orthography (their character sequences), also using LSTMs.
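The core move, building a word's representation from its characters instead of looking the word up, can be sketched briefly; here a simple tanh recurrence stands in for the paper's bidirectional LSTMs, and all names, sizes, and weights are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    CHARS = "abcdefghijklmnopqrstuvwxyz"
    char_emb = {c: rng.normal(size=16) for c in CHARS}   # character embeddings
    Wx, Wh = rng.normal(size=(32, 16)), rng.normal(size=(32, 32))

    def compose(word):
        # Run a recurrence over the characters; the final state serves as the
        # word's representation, so rare and unseen words still get a vector.
        h = np.zeros(32)
        for c in word:
            h = np.tanh(Wx @ char_emb[c] + Wh @ h)
        return h

    # Any string over known characters gets a representation; with trained
    # weights, related forms such as "parsing"/"parsed" end up with related
    # vectors, which a per-word lookup table cannot guarantee.
    print(compose("parsing").shape)  # (32,)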
Universal Dependencies 2.1
TLDR
The annotation scheme is based on (universal) Stanford dependencies, Google universal part-of-speech tags, and the Interset interlingua for morphosyntactic tagsets.
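For reference, UD treebanks are distributed in the ten-column CoNLL-U format (ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC). A small illustrative fragment, with the FEATS column abbreviated:

    # text = Dogs bark.
    1   Dogs   dog    NOUN    _   Number=Plur   2   nsubj   _   _
    2   bark   bark   VERB    _   Number=Plur   0   root    _   SpaceAfter=No
    3   .      .      PUNCT   _   _             2   punct   _   _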
MaltOptimizer: A System for MaltParser Optimization
TLDR
MaltOptimizer is an interactive system that first analyzes the training set to select a suitable starting point for optimization and then guides the user through the optimization of the parsing algorithm, feature model, and learning algorithm.
MaltOptimizer: An Optimization Tool for MaltParser
TLDR
MaltOptimizer is a tool that facilitates optimization of parsers built with MaltParser, a data-driven dependency parser generator; it analyzes the training data and guides the user through a three-phase optimization process.
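Each phase is driven from the command line over a MaltParser jar and a CoNLL training corpus. A hedged sketch of the invocation (jar and corpus names are placeholders, and the flags follow the paper's usage example, so they should be checked against the release at hand):

    java -jar MaltOptimizer.jar -p 1 -m maltparser.jar -c training_corpus.conll

Phases 2 (feature model) and 3 (learning algorithm) are run the same way with -p 2 and -p 3.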
What Do Recurrent Neural Network Grammars Learn About Syntax?
TLDR
By training grammars without nonterminal labels, it is found that phrasal representations depend minimally on nonterminals, providing support for the endocentricity hypothesis.