Publications
Meta Networks
TLDR
We introduce a novel meta-learning method, Meta Networks (MetaNet), that learns meta-level knowledge across tasks and shifts its inductive biases via fast parameterization for rapid generalization.
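The fast-parameterization idea can be illustrated with a minimal sketch: a layer combines slow weights learned across tasks with fast, task-specific weights (in the paper these are generated by a meta learner from loss gradients; here they are simply passed in). The names and shapes are illustrative, not the paper's implementation.

```python
import numpy as np

def layer_with_fast_weights(x, W_slow, W_fast):
    """Layer activation using slow (cross-task) plus fast (per-task) weights.

    W_slow is learned by ordinary training across tasks; W_fast is a
    task-specific parameterization. With zero fast weights this reduces
    to an ordinary feed-forward layer.
    """
    return np.tanh(x @ (W_slow + W_fast))
```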
Neural Semantic Encoders
TLDR
We present a memory-augmented neural network for natural language understanding: Neural Semantic Encoders.
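The memory-augmented encoding can be sketched as a read–compose–write cycle over a memory of slot vectors: attend over memory with the current input, compose the retrieved vector with the input, and write the result back to the attended slots. This is a rough illustration with fixed functions; the actual model learns its read, compose, and write functions.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def nse_step(x, memory, Wc):
    """One read-compose-write step over a (slots, dim) memory."""
    # read: attend over memory slots with the current input vector
    attn = softmax(memory @ x)
    m = attn @ memory                       # retrieved memory vector
    # compose: combine the input with the retrieved memory
    c = np.tanh(Wc @ np.concatenate([x, m]))
    # write: blend the composed vector into the attended slots
    memory = (1 - attn[:, None]) * memory + attn[:, None] * c
    return c, memory
```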
The CHEMDNER corpus of chemicals and drugs and its annotation principles
TLDR
We present the CHEMDNER corpus, a collection of 10,000 PubMed abstracts that contain a total of 84,355 chemical entity mentions labeled manually by expert chemistry literature curators, following annotation guidelines specifically defined for this task.
Building Dynamic Knowledge Graphs from Text using Machine Reading Comprehension
TLDR
We propose a neural machine-reading model that constructs dynamic knowledge graphs from procedural text and uses them to track the evolving states of participant entities.
Rapid Adaptation with Conditionally Shifted Neurons
TLDR
We describe a mechanism by which artificial neural networks can learn rapid adaptation - the ability to adapt on the fly, with little data, to new tasks - that we call conditionally shifted neurons.
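The conditionally shifted neurons mechanism can be sketched as a layer whose pre-activations are shifted neuron-wise by task-conditioned values. In the paper the shifts are derived from error information on a task's description set and retrieved from memory; in this illustrative sketch they are simply passed in.

```python
import numpy as np

def conditionally_shifted_layer(x, W, shifts):
    """ReLU layer whose pre-activations are shifted neuron-wise.

    `shifts` carries task-specific conditioning information; with zero
    shifts this reduces to an ordinary feed-forward ReLU layer.
    """
    return np.maximum(x @ W + shifts, 0.0)
```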
Neural Tree Indexers for Text Understanding
TLDR
We introduce Neural Tree Indexers (NTI), a robust, syntactic-parsing-independent tree-structured model that provides a middle ground between sequential RNNs and syntactic tree-based recursive models.
Incorporating domain knowledge in chemical and biomedical named entity recognition with word representations
TLDR
We present a semi-supervised learning method that efficiently exploits unlabeled data to incorporate domain knowledge into a named entity recognition model and improve system performance.
Sentence Simplification with Memory-Augmented Neural Networks
TLDR
We adapt an architecture with augmented memory capacities called Neural Semantic Encoders (Munkhdalai and Yu, 2017) for sentence simplification.
Reasoning with Memory Augmented Neural Networks for Language Comprehension
TLDR
We introduce a computational hypothesis-testing approach built on memory-augmented neural networks, specifically Neural Semantic Encoders (NSE).
Self-Supervised Meta-Learning for Few-Shot Natural Language Classification Tasks
TLDR
This paper proposes a self-supervised approach to generate a large, rich, meta-learning task distribution from unlabeled text.