Publications
Linguistic Regularities in Continuous Space Word Representations
TLDR
We examine the vector-space word representations that are implicitly learned by the input-layer weights of a recurrent neural network language model, and find that they capture syntactic and semantic regularities in language.
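The regularities are tested with simple vector arithmetic: "a is to b as c is to ?" is answered by the word nearest to b - a + c. A minimal sketch of that offset method, assuming embeddings are available as a plain dict (the names `emb` and `analogy` are illustrative, not from the paper):

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(emb, a, b, c, topn=1):
    """Solve "a is to b as c is to ?" with the offset vector b - a + c.

    emb: hypothetical dict mapping word -> 1-D numpy array, e.g. rows
    read off a trained language model's input-layer weight matrix.
    """
    target = emb[b] - emb[a] + emb[c]
    scored = [(w, cosine(target, v)) for w, v in emb.items() if w not in (a, b, c)]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:topn]

# analogy(emb, "man", "king", "woman") should rank "queen" first
# when the embeddings capture the reported regularities.
```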
Syntactic Clustering of the Web
TLDR
We have developed an efficient way to determine the syntactic similarity of files and have applied it to every document on the World Wide Web.
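The similarity measure here is the resemblance between documents' sets of contiguous word shingles. A minimal sketch of shingling and resemblance, assuming whitespace tokenization; the web-scale system estimates this with fixed-size min-hash sketches rather than comparing full shingle sets:

```python
def shingles(text, w=4):
    """The set of contiguous w-word shingles of a document."""
    words = text.split()
    return {tuple(words[i:i + w]) for i in range(max(len(words) - w + 1, 1))}

def resemblance(doc_a, doc_b, w=4):
    """Jaccard resemblance of the two documents' shingle sets, in [0, 1]."""
    sa, sb = shingles(doc_a, w), shingles(doc_b, w)
    return len(sa & sb) / len(sa | sb)
```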
From captions to visual concepts and back
TLDR
This paper presents a novel approach for automatically generating image descriptions, in which visual detectors, language models, and multimodal similarity models are learned directly from a dataset of image captions.
An introduction to computational networks and the computational network toolkit (invited talk)
TLDR
We introduce the computational network (CN), a unified framework for describing arbitrary learning machines that can be expressed as a series of computational steps, such as deep neural networks (DNNs), convolutional neural networks (CNNs), recurrent neural networks (RNNs), long short-term memory (LSTM) networks, logistic regression, and maximum entropy models.
Context dependent recurrent neural network language model
TLDR
We propose a topic-conditioned RNNLM in which Latent Dirichlet Allocation, applied to a block of preceding text, produces a contextual real-valued input vector associated with each word.
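A sketch of the idea (parameter names are illustrative, not the paper's exact notation): the contextual vector feeds both the recurrent hidden layer and the output layer at every step.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def rnnlm_step(w, s_prev, f, U, W, F, V, G):
    """One step of a context-conditioned RNNLM.

    w: current word's input vector; f: contextual real-valued vector,
    e.g. an LDA topic posterior over the preceding block of text.
    """
    s = sigmoid(U @ w + W @ s_prev + F @ f)  # hidden state sees the context
    y = softmax(V @ s + G @ f)               # so does the next-word distribution
    return s, y
```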
Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding
TLDR
We propose the use of recurrent neural networks for the SLU slot filling task, and present several novel architectures designed to efficiently model past and future temporal dependencies.
Hybrid Code Networks: practical and efficient end-to-end dialog control with supervised and reinforcement learning
TLDR
We introduce Hybrid Code Networks for end-to-end learning of task-oriented dialog systems, which combine an RNN with domain-specific knowledge encoded as software and system action templates.
Spoken language understanding using long short-term memory neural networks
TLDR
We investigate using long short-term memory (LSTM) neural networks, which contain input, output, and forgetting gates and are more advanced than simple RNNs, for the word labeling task.
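The gates mentioned in the summary are the standard LSTM machinery; a minimal single-step sketch with stacked parameters (layout and names are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step with input (i), forget (f), and output (o) gates.

    W, U, b stack the parameters for the four affine maps (i, f, o, and
    the candidate cell g), each of hidden size n:
    W: (4n, input_dim), U: (4n, n), b: (4n,).
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:n])          # input gate: how much new content to write
    f = sigmoid(z[n:2 * n])     # forget gate: how much of c_prev to keep
    o = sigmoid(z[2 * n:3 * n]) # output gate: how much of the cell to expose
    g = np.tanh(z[3 * n:])      # candidate cell contents
    c = f * c_prev + i * g      # new cell state
    h = o * np.tanh(c)          # new hidden state
    return h, c
```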
Recurrent neural networks for language understanding
TLDR
We present an adaptation of Recurrent Neural Networks to perform Language Understanding, and advance the state-of-the-art for the widely used ATIS dataset.
Achieving Human Parity in Conversational Speech Recognition
TLDR
Conversational speech recognition has served as a flagship speech recognition task since the release of the Switchboard corpus in the 1990s.