Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
In this paper, we propose a novel neural network model called RNN Encoder–Decoder that consists of two recurrent neural networks (RNNs).
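The encoder–decoder structure this summary describes can be sketched as follows. This is a minimal illustration only, with plain tanh RNN cells and made-up dimensions; the actual model uses gated hidden units and a learned output layer:

```python
import numpy as np

def rnn_step(x, h, Wx, Wh, b):
    # One vanilla RNN step; a plain tanh cell is enough to show
    # the encoder-decoder shape (the paper uses gated units).
    return np.tanh(x @ Wx + h @ Wh + b)

def encode(xs, params, d_h):
    # Fold the whole source sequence into one fixed-length vector.
    h = np.zeros(d_h)
    for x in xs:
        h = rnn_step(x, h, *params)
    return h

def decode(c, params, steps, d_x):
    # Illustrative decoder: unrolls from the summary vector and emits
    # one hidden state per output step (no output projection shown).
    h, outputs = c, []
    for _ in range(steps):
        h = rnn_step(np.zeros(d_x), h, *params)
        outputs.append(h)
    return outputs

# Toy usage with illustrative sizes (not from the paper)
rng = np.random.default_rng(1)
d_x, d_h = 3, 5
params = (rng.normal(size=(d_x, d_h)),
          rng.normal(size=(d_h, d_h)),
          np.zeros(d_h))
xs = rng.normal(size=(4, d_x))      # source sequence of length 4
c = encode(xs, params, d_h)         # fixed-length summary vector
outs = decode(c, params, 3, d_x)    # 3 decoder hidden states
```

The key design point is the bottleneck: the decoder sees the source only through the single fixed-length vector `c`.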
Neural Machine Translation by Jointly Learning to Align and Translate
Neural machine translation is a recently proposed approach to machine translation.
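This paper's central addition is an attention mechanism that lets the decoder attend to all encoder states instead of a single fixed vector. A minimal sketch of additive (Bahdanau-style) attention scoring, with illustrative weight names and toy dimensions (these names are not from the paper):

```python
import numpy as np

def additive_attention(decoder_state, encoder_states, W_s, W_h, v):
    """Score every encoder state against the current decoder state,
    softmax the scores into alignment weights, and return the weights
    plus the resulting context vector (weighted sum of encoder states)."""
    # e_j = v^T tanh(W_s s + W_h h_j) for each encoder state h_j
    scores = np.tanh(encoder_states @ W_h.T + decoder_state @ W_s.T) @ v
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states
    return weights, context

# Toy dimensions (illustrative only)
rng = np.random.default_rng(0)
d_enc, d_dec, d_att, T = 4, 3, 5, 6
h = rng.normal(size=(T, d_enc))    # T encoder hidden states
s = rng.normal(size=(d_dec,))      # current decoder state
W_h = rng.normal(size=(d_att, d_enc))
W_s = rng.normal(size=(d_att, d_dec))
v = rng.normal(size=(d_att,))

w, c = additive_attention(s, h, W_s, W_h, v)
```

The weights `w` form a soft alignment over source positions, recomputed at every decoding step, which is what removes the fixed-length bottleneck.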
On the Properties of Neural Machine Translation: Encoder-Decoder Approaches
We focus on analyzing the properties of neural machine translation using two models: the RNN Encoder–Decoder and a newly proposed gated recursive convolutional neural network.
Theano: A Python framework for fast computation of mathematical expressions
Theano is a Python library that allows users to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently.
Attention-Based Models for Speech Recognition
We extend the attention mechanism with features needed for speech recognition and propose a novel method of adding location-awareness to the attention mechanism.
An Actor-Critic Algorithm for Sequence Prediction
We present an approach to training neural networks to generate sequences using actor-critic methods from reinforcement learning (RL).
End-to-end attention-based large vocabulary speech recognition
We investigate an alternative method for sequence modelling based on an attention mechanism that allows a recurrent neural network (RNN) to learn alignments between sequences of input frames and output labels.
End-to-end Continuous Speech Recognition using Attention-based Recurrent NN: First Results
We replace the hidden Markov model (HMM), which is traditionally used in continuous speech recognition, with a bi-directional recurrent neural network encoder coupled to a recurrent neural network decoder that directly emits a stream of phonemes.
Systematic Generalization: What Is Required and Can It Be Learned?
Numerous models for grounded language understanding are proposed, including (i) generic models that can be easily adapted to any given task and (ii) intuitively appealing modular models that require background knowledge to be instantiated.
BabyAI: First Steps Towards Grounded Language Learning With a Human In the Loop
We introduce the BabyAI research platform to support investigations towards including humans in the loop for grounded language learning.