Publications
What Happened to My Dog in That Network: Unraveling Top-down Generators in Convolutional Neural Networks
TLDR
Top-down information plays a central role in human perception, but has relatively little role in many current state-of-the-art deep networks, such as Convolutional Neural Networks.
Rethinking Skip-thought: A Neighborhood based Approach
TLDR
We empirically show that our skip-thought neighbor model performs as well as the skip-thought model on evaluation tasks.
Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding
TLDR
We propose an asymmetric encoder-decoder structure that keeps an RNN as the encoder and uses a CNN as the decoder; it learns to encode the current sentence and decode the subsequent contiguous words all at once.
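Since this TLDR describes a concrete architecture, here is a minimal PyTorch sketch of the idea; the class name, layer sizes, and the tiling of the sentence vector across output positions are illustrative assumptions, not the paper's implementation.

```python
# RNN encoder + CNN decoder: encode the current sentence once, then
# predict a fixed window of subsequent words non-autoregressively.
import torch
import torch.nn as nn

class RNNEncoderCNNDecoder(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=300, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # 1-D convolutions map the sentence vector to predictions for
        # all target positions at once (no step-by-step decoding).
        self.decoder = nn.Sequential(
            nn.Conv1d(hid_dim, hid_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(hid_dim, vocab_size, kernel_size=3, padding=1),
        )

    def forward(self, tokens, target_len=30):
        # tokens: (batch, seq_len) word ids for the current sentence
        _, h = self.encoder(self.embed(tokens))             # h: (1, batch, hid)
        z = h[-1].unsqueeze(-1).expand(-1, -1, target_len)  # tile over positions
        return self.decoder(z)                              # (batch, vocab, target_len)

logits = RNNEncoderCNNDecoder()(torch.randint(0, 10000, (4, 20)))
```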
Trimming and Improving Skip-thought Vectors
TLDR
In this paper, we propose a suite of three techniques to trim and improve the original skip-thought model.
Exploring Asymmetric Encoder-Decoder Structure for Context-based Sentence Representation Learning
TLDR
In this paper, we explore an asymmetric encoder-decoder structure for unsupervised context-based sentence representation learning.
Improving Sentence Representations with Multi-view Frameworks
TLDR
We present two multi-view frameworks for learning sentence representations in an unsupervised fashion.
An Empirical Study on Post-processing Methods for Word Embeddings
TLDR
We re-examine the post-processing of word vectors as shrinkage estimation: the underlying oracle Gram matrix of word similarity, estimated by the learned word vectors, is shrunk towards a scaled identity matrix.
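A minimal numpy sketch of the shrinkage estimation described above; the shrinkage intensity `lam` and the eigen-decomposition recovery step are illustrative assumptions, not the paper's exact procedure.

```python
# Shrink the empirical Gram matrix of word similarities towards a
# scaled identity, then recover post-processed word vectors from the
# shrunk matrix. Sizes and the shrinkage weight are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(1000, 300))      # learned word vectors (vocab x dim)
G = W @ W.T                           # empirical Gram matrix of similarities

lam = 0.1                             # shrinkage intensity (hypothetical)
mu = np.trace(G) / G.shape[0]         # scale of the identity target
G_shrunk = (1 - lam) * G + lam * mu * np.eye(G.shape[0])

# Word vectors consistent with the shrunk Gram matrix can be recovered
# from its eigen-decomposition (top 300 components).
vals, vecs = np.linalg.eigh(G_shrunk)
W_post = vecs[:, -300:] * np.sqrt(np.clip(vals[-300:], 0.0, None))
```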
Learning Distributed Representations of Symbolic Structure Using Binding and Unbinding Operations
TLDR
We propose the TPRU, a recurrent unit that, at each time step, explicitly executes structural-role binding and unbinding operations to incorporate structural information into learning.
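Since the TLDR names explicit binding and unbinding operations, here is a minimal numpy sketch of those operations as defined in tensor product representations; the dimensions and the orthonormal-role construction are illustrative assumptions.

```python
# Binding: attach each filler (symbol) vector to a role vector via an
# outer product and sum the bindings into one distributed representation.
# Unbinding: contract with a role vector to recover its filler.
import numpy as np

rng = np.random.default_rng(0)
fillers = rng.normal(size=(3, 64))                    # three symbol vectors
roles = np.linalg.qr(rng.normal(size=(16, 3)))[0].T   # orthonormal roles (3 x 16)

T = sum(np.outer(f, r) for f, r in zip(fillers, roles))  # bound structure (64 x 16)

recovered = T @ roles[1]              # unbind role 1 -> its filler
assert np.allclose(recovered, fillers[1], atol=1e-6)
```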
Multi-view Sentence Representation Learning
TLDR
We create a unified multi-view sentence representation learning framework in which one view encodes the input sentence with a Recurrent Neural Network (RNN) and the other view encodes it with a simple linear model; the training objective is to maximise the agreement between the two views, as specified by the adjacent context information.
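A minimal PyTorch sketch of the two-view setup described in the TLDR; the cosine-agreement loss shown here is an illustrative stand-in for the paper's objective, and all sizes are hypothetical.

```python
# View 1: RNN over word embeddings. View 2: averaged word embeddings
# (a simple linear model). Training pushes the two views to agree on a
# sentence and its adjacent context.
import torch
import torch.nn as nn
import torch.nn.functional as F

emb = nn.Embedding(10000, 300)
rnn = nn.GRU(300, 300, batch_first=True)

def rnn_view(tokens):                  # tokens: (batch, seq_len)
    _, h = rnn(emb(tokens))
    return h[-1]                       # final hidden state, (batch, 300)

def linear_view(tokens):
    return emb(tokens).mean(dim=1)     # averaged word embeddings

sent = torch.randint(0, 10000, (8, 20))   # current sentences
ctx = torch.randint(0, 10000, (8, 20))    # adjacent (context) sentences

# Cross-view agreement: the RNN view of a sentence should align with
# the linear view of its neighbouring context, and vice versa.
loss = -(F.cosine_similarity(rnn_view(sent), linear_view(ctx)).mean()
         + F.cosine_similarity(linear_view(sent), rnn_view(ctx)).mean())
```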
On Transfer Learning via Linearized Neural Networks
We propose to linearize neural networks for transfer learning via a first-order Taylor approximation. Making neural networks linear in this way makes the optimization convex.
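A minimal numpy sketch of the linearization described above: the network is replaced by its first-order Taylor expansion around the pretrained weights, which is linear in the parameters, so fitting the target task becomes a convex (least-squares) problem. The tiny two-layer model and the finite-difference Jacobian are illustrative assumptions.

```python
# f_lin(x; theta) = f(x; theta0) + J(x; theta0) @ (theta - theta0)
import numpy as np

rng = np.random.default_rng(0)
w1, w2 = rng.normal(size=(16, 4)), rng.normal(size=16)  # "pretrained" weights
theta0 = np.concatenate([w1.ravel(), w2])

def f(x, theta):
    a, b = theta[:64].reshape(16, 4), theta[64:]
    return b @ np.tanh(a @ x)                           # scalar-output network

def jacobian(x, theta, eps=1e-6):
    # Finite-difference gradient of f with respect to the parameters.
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (f(x, theta + e) - f(x, theta - e)) / (2 * eps)
    return g

def f_lin(x, theta):
    # First-order Taylor expansion around theta0: linear in theta.
    return f(x, theta0) + jacobian(x, theta0) @ (theta - theta0)

x = rng.normal(size=4)
print(f(x, theta0), f_lin(x, theta0))                   # identical at theta0
```

At theta0 the two models agree exactly, and for parameters near theta0 the linearized model tracks the network while keeping the training objective convex in the parameters.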