Publications
Learning by Stretching Deep Networks
tl;dr
In recent years, deep architectures have gained prominence for learning complex AI tasks because of their ability to model complex variations in the data.
Exemplar Encoder-Decoder for Neural Conversation Generation
tl;dr
We present the Exemplar Encoder-Decoder network (EED), a novel conversation model that learns to utilize similar examples from training data to generate responses.
Variational methods for conditional multimodal deep learning
tl;dr
In this paper, we address the problem of conditional modality learning, whereby one is interested in generating one modality given the other.
To go deep or wide in learning?
tl;dr
In this paper, we propose an approach called wide learning, based on arc-cosine kernels, that learns a single layer of infinite width.
Unsupervised Feature Learning with Discriminative Encoder
tl;dr
In recent years, deep discriminative models have achieved extraordinary performance on supervised learning tasks, significantly outperforming their generative counterparts.
Discriminative Neural Topic Models
tl;dr
We propose a neural-network-based approach for learning topics from text and image datasets that does not explicitly model the distribution of observed features given the latent classes.
Minimum description length principle for maximum entropy model selection
tl;dr
We treat the problem of selecting a maximum entropy model, given various feature subsets and their moments, as a model selection problem and present a minimum description length (MDL) formulation to solve it.
Learning to Segment With Image-Level Supervision
tl;dr
We propose a model that generates auxiliary labels for each image, while simultaneously forcing the output of the CNN to satisfy the mean-field constraints imposed by a conditional random field.
Generative Maximum Entropy Learning for Multiclass Classification
tl;dr
The maximum entropy approach to classification is well studied in applied statistics and machine learning, and almost all methods in the literature are discriminative in nature.
On collapsed representation of hierarchical Completely Random Measures
tl;dr
The aim of the paper is to provide an exact approach for generating a Poisson process sampled from a hierarchical CRM, without having to instantiate the infinitely many atoms of the random measures.