Publications
Neural Architectures for Named Entity Recognition
Paper presented at the 2016 Conference of the North American Chapter of the Association for Computational Linguistics, held in San Diego (CA, USA), June 12–17, 2016.
  • Citations: 1,887
  • Influence: 400
  • Open Access
Transition-Based Dependency Parsing with Stack Long Short-Term Memory
This work was sponsored in part by the U.S. Army Research Laboratory and the U.S. Army Research Office under contract/grant number W911NF-10-1-0533, and in part by NSF CAREER grant … (A sketch of the stack LSTM mechanism follows this entry.)
  • Citations: 596
  • Influence: 76
  • Open Access
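The stack LSTM in the title is an LSTM whose sequence of states is kept on a stack: pushing runs one recurrent step, popping reverts the summary to the previous state. A minimal sketch of that mechanism, assuming a toy numpy LSTM cell and illustrative dimensions, not the authors' implementation:

    import numpy as np

    rng = np.random.default_rng(0)
    D = 8  # hidden size (illustrative)

    # Toy LSTM cell: one fused weight matrix over [input; hidden].
    W = rng.normal(scale=0.1, size=(4 * D, 2 * D))
    b = np.zeros(4 * D)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x, h, c):
        z = W @ np.concatenate([x, h]) + b
        i, f, o, g = np.split(z, 4)
        c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h_new = sigmoid(o) * np.tanh(c_new)
        return h_new, c_new

    class StackLSTM:
        """Keep every (h, c) on a stack: push runs one LSTM step,
        pop reverts the summary to the previous state."""
        def __init__(self):
            self.states = [(np.zeros(D), np.zeros(D))]  # state of the empty stack
        def push(self, x):
            h, c = self.states[-1]
            self.states.append(lstm_step(x, h, c))
        def pop(self):
            self.states.pop()
        def summary(self):
            return self.states[-1][0]  # h at the current top

    s = StackLSTM()
    s.push(rng.normal(size=D))   # e.g. a SHIFT pushes a word vector
    s.push(rng.normal(size=D))
    s.pop()                      # e.g. a REDUCE pops it back off
    print(s.summary().shape)     # (8,)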
Recurrent Neural Network Grammars
We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and … (A sketch of the RNNG action sequence follows this entry.)
  • Citations: 316
  • Influence: 65
  • Open Access
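An RNNG generates a sentence and its phrase structure jointly through a sequence of actions: NT(X) opens a nonterminal, GEN(w) emits a word, REDUCE closes the current constituent. A minimal sketch that linearizes a tree into that action sequence (the tuple-based tree encoding is illustrative):

    def tree_to_actions(tree):
        """tree is a word (str) or a (label, children) pair."""
        if isinstance(tree, str):
            return [("GEN", tree)]        # generate a terminal word
        label, children = tree
        actions = [("NT", label)]         # open a nonterminal
        for child in children:
            actions += tree_to_actions(child)
        actions.append(("REDUCE", None))  # close the current constituent
        return actions

    # (S (NP The hungry cat) (VP meows))
    tree = ("S", [("NP", ["The", "hungry", "cat"]), ("VP", ["meows"])])
    for action in tree_to_actions(tree):
        print(action)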
Many Languages, One Parser
We train one multilingual model for dependency parsing and use it to parse sentences in several languages. The parsing model uses (i) multilingual word clusters and embeddings; (ii) token-level … (A sketch of the token representation follows this entry.)
  • Citations: 153
  • Influence: 34
  • Open Access
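One way to read the recipe: every token is embedded in a shared multilingual space and tagged with the language it came from, so a single parser can consume any of the languages. A minimal sketch under that assumption; the lookup tables and dimensions are hypothetical:

    import numpy as np

    rng = np.random.default_rng(1)
    WORD_DIM, LANG_DIM = 6, 3  # illustrative sizes

    # Hypothetical lookup tables: word embeddings shared across languages,
    # plus a small embedding for the language ID itself.
    word_emb = {w: rng.normal(size=WORD_DIM) for w in ["dog", "perro", "chien"]}
    lang_emb = {l: rng.normal(size=LANG_DIM) for l in ["en", "es", "fr"]}

    def token_input(word, lang):
        """One input vector for the shared parser: multilingual word
        embedding concatenated with a language embedding."""
        return np.concatenate([word_emb[word], lang_emb[lang]])

    print(token_input("perro", "es").shape)  # (9,) -- same parser, any language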
Improved Transition-based Parsing by Modeling Characters instead of Words with LSTMs
We present extensions to a continuous-state dependency parsing method that make it applicable to morphologically rich languages. Starting with a high-performance transition-based parser that uses long … (A sketch of character-level word composition follows this entry.)
  • Citations: 260
  • Influence: 27
  • Open Access
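The key move is to build each word's vector from its characters rather than looking it up whole, so rich morphology stops causing out-of-vocabulary failures. A minimal sketch of bidirectional character composition; a plain tanh RNN cell stands in for the paper's LSTM, and all names and sizes are illustrative:

    import numpy as np

    rng = np.random.default_rng(2)
    C, H = 4, 5  # char embedding size, hidden size (illustrative)

    char_emb = {ch: rng.normal(size=C) for ch in "abcdefghijklmnopqrstuvwxyz"}
    W_fwd = rng.normal(scale=0.1, size=(H, C + H))
    W_bwd = rng.normal(scale=0.1, size=(H, C + H))

    def run(word, W, reverse=False):
        """One recurrent pass over the word's characters."""
        h = np.zeros(H)
        for ch in (reversed(word) if reverse else word):
            h = np.tanh(W @ np.concatenate([char_emb[ch], h]))
        return h

    def word_vector(word):
        # The representation is composed from characters, so unseen
        # inflected forms still get a vector without a word lookup.
        return np.concatenate([run(word, W_fwd), run(word, W_bwd, reverse=True)])

    print(word_vector("parse").shape)    # (10,)
    print(word_vector("parsers").shape)  # same machinery, no OOV problem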
DyNet: The Dynamic Neural Network Toolkit
We describe DyNet, a toolkit for implementing neural network models based on dynamic declaration of network structure. In the static declaration strategy that is used in toolkits like Theano, CNTK, … (A sketch of dynamic declaration follows this entry.)
  • Citations: 303
  • Influence: 26
  • Open Access
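With dynamic declaration, the computation graph is rebuilt for every example, so its shape can track the input, e.g. a different sequence length each time. A minimal training-loop sketch, assuming DyNet's Python bindings roughly as documented (toy data, illustrative dimensions; consult the DyNet docs for the current API):

    import random
    import dynet as dy

    # Toy data: variable-length sequences of 5-dim vectors, scalar targets.
    data = [([[random.random() for _ in range(5)]
              for _ in range(random.randint(2, 6))],
             random.random())
            for _ in range(20)]

    pc = dy.ParameterCollection()
    W = pc.add_parameters((1, 10))        # read-out layer
    lstm = dy.LSTMBuilder(1, 5, 10, pc)   # layers, input dim, hidden dim
    trainer = dy.SimpleSGDTrainer(pc)

    for xs, y in data:
        dy.renew_cg()                     # fresh graph per example: the dynamic part
        state = lstm.initial_state()
        for x in xs:                      # the graph grows with the sequence
            state = state.add_input(dy.inputVector(x))
        pred = W * state.output()         # older DyNet needs dy.parameter(W) here
        loss = dy.squared_distance(pred, dy.inputVector([y]))
        loss.value()                      # forward
        loss.backward()                   # backward through this example's graph
        trainer.update()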
MaltOptimizer: A System for MaltParser Optimization
Freely available statistical parsers often require careful optimization to produce state-of-the-art results, which can be a non-trivial task, especially for application developers who are not …
  • Citations: 64
  • Influence: 13
  • Open Access
What Do Recurrent Neural Network Grammars Learn About Syntax?
Recurrent neural network grammars (RNNG) are a recently proposed probabilistic generative modeling family for natural language. They show state-of-the-art language modeling and parsing performance. …
  • Citations: 102
  • Influence: 12
  • Open Access
MaltOptimizer: An Optimization Tool for MaltParser
Data-driven systems for natural language processing have the advantage that they can easily be ported to any language or domain for which appropriate training data can be found. However, many …
  • Citations: 67
  • Influence: 12
  • Open Access
Are Emojis Predictable?
Paper presented at the 15th Conference of the European Chapter of the Association for Computational Linguistics, held April 3–7, 2017, in Valencia, Spain.
  • Citations: 77
  • Influence: 10
  • Open Access