• Publications
Generative Adversarial Text to Image Synthesis
TLDR
A novel deep architecture and GAN formulation is developed to effectively bridge advances in text and image modeling, translating visual concepts from characters to pixels.
An efficient framework for learning sentence representations
TLDR
This work reformulates the problem of predicting the context in which a sentence appears as a classification problem, and proposes a simple and efficient framework for learning sentence representations from unlabelled data.
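The classification reformulation can be sketched as follows: encode the input sentence and a set of candidate context sentences, score each candidate by inner product, and train with softmax cross-entropy against the true context. The encoder is abstracted away here, and the inner-product scoring with hand-set toy vectors is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

def context_classification_loss(enc_f, enc_g, target_idx):
    """Score each candidate context sentence against the input sentence
    by inner product, then apply softmax cross-entropy.
    enc_f: (d,) encoding of the input sentence.
    enc_g: (k, d) encodings of k candidate context sentences.
    target_idx: index of the true context sentence among the candidates.
    """
    scores = enc_g @ enc_f            # (k,) inner-product scores
    scores = scores - scores.max()    # shift for numerical stability
    probs = np.exp(scores) / np.exp(scores).sum()
    return -np.log(probs[target_idx])

# Toy check: the true context's encoding is most similar to the input,
# so the loss for the true index is lower than for any distractor.
f = np.array([1.0, 0.0])
g = np.array([[0.9, 0.1],     # true context (index 0)
              [-1.0, 0.5],    # distractor
              [0.0, -1.0]])   # distractor
loss = context_classification_loss(f, g, target_idx=0)
```

Because the objective only needs to discriminate the true context from distractors (rather than reconstruct it word by word), training reduces to cheap classification over encodings.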
Sentence Ordering and Coherence Modeling using Recurrent Neural Networks
TLDR
This work proposes an end-to-end unsupervised deep learning approach based on the set-to-sequence framework to address the structure of coherent texts and shows that useful text representations can be obtained by learning to order sentences.
Zero-Shot Entity Linking by Reading Entity Descriptions
TLDR
It is shown that strong reading comprehension models pre-trained on large unlabeled data can generalize to unseen entities, and domain-adaptive pre-training (DAP) is proposed to address the domain shift associated with linking unseen entities in a new domain.
Content preserving text generation with attribute controls
TLDR
This work addresses the problem of modifying textual attributes of sentences by introducing a reconstruction loss which interpolates between auto-encoding and back-translation loss components and proposes an adversarial loss to enforce generated samples to be attribute compatible and realistic.
Sentence Ordering using Recurrent Neural Networks
TLDR
This work proposes an end-to-end neural approach based on the recently proposed set-to-sequence mapping framework to address the sentence ordering problem, achieving state-of-the-art performance in the order discrimination task on two datasets widely used in the literature.
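The sentence-ordering objective behind these two papers can be illustrated with a toy scorer: assign each ordered pair of sentences an "i should precede j" score and search for the permutation maximizing the total. The brute-force search and hand-set score matrix below are stand-ins for the neural set-to-sequence model, which learns these preferences and decodes them efficiently:

```python
import itertools

def sequence_score(order, pairwise):
    """Score an ordering as the sum of pairwise precedence scores
    pairwise[a][b] over every pair (a, b) appearing in that relative order."""
    return sum(pairwise[a][b]
               for i, a in enumerate(order)
               for b in order[i + 1:])

def best_order(n, pairwise):
    """Exhaustively search all permutations of n sentences (toy sizes only)."""
    return max(itertools.permutations(range(n)),
               key=lambda order: sequence_score(order, pairwise))

# Hand-set scores encoding that sentence 0 precedes 1, and 1 precedes 2.
pairwise = [[0, 1, 1],
            [0, 0, 1],
            [0, 0, 0]]
recovered = best_order(3, pairwise)
```

Order discrimination then amounts to checking that the gold ordering scores higher than a shuffled one.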
Few-shot Sequence Learning with Transformers
TLDR
This work investigates few-shot learning in the setting where the data points are sequences of tokens and proposes an efficient learning algorithm based on Transformers that works at least as well as other methods, while being more computationally efficient.
Performance, Resource, and Cost Aware Resource Provisioning in the Cloud
TLDR
This work proposes a dynamic and computationally efficient reconfiguration scheme which comprises an Application Performance Model, a Cost Model, and a Reconfiguration algorithm to enable cost-efficient resource usage in the cloud.
Solving Jigsaw Puzzles using Paths and Cycles
TLDR
A compatibility measure based on the idea of cycles is defined and used to guide a greedy solver; it beats state-of-the-art performance, with significant improvements in the more challenging setting of smaller piece sizes.
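One way to read the cycle idea: pairwise piece placements composed around a closed loop should return to the starting piece, and a violation signals at least one bad match in the loop. The translation-only model below is a simplifying assumption for illustration, not the paper's actual compatibility measure:

```python
def cycle_consistent(displacements, tol=0):
    """A cycle of pairwise piece placements is consistent if the relative
    displacements composed around the loop sum to zero (back to the start).
    displacements: list of (dx, dy) steps for consecutive edges of the cycle.
    """
    dx = sum(d[0] for d in displacements)
    dy = sum(d[1] for d in displacements)
    return abs(dx) <= tol and abs(dy) <= tol

# A 2x2 block of pieces traversed right, down, left, up closes the loop.
square = [(1, 0), (0, 1), (-1, 0), (0, -1)]
# Replacing one edge with a wrong placement breaks closure.
broken = [(1, 0), (0, 1), (-1, 0), (0, -2)]
```

A greedy solver can use such loop checks to accept only matches that are corroborated by the surrounding cycle, rather than trusting each pairwise match in isolation.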
Performance, resource and cost aware virtual machine adaptation
TLDR
This work takes a novel perspective in addressing the auto-scaling problem, where it assumes that the cloud operator exposes a small, dynamic fraction of its infrastructure and the corresponding resource specifications and constraints to each application, and proposes a dynamic VM reconfiguration scheme.
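The interplay of performance and cost models in these reconfiguration schemes can be sketched as a constrained selection: pick the cheapest exposed VM configuration whose predicted latency under the current workload still meets the SLA. The M/M/1-style latency model and the configuration tuples are hypothetical placeholders, not the papers' actual models:

```python
def choose_config(configs, workload, sla_latency):
    """Pick the cheapest VM configuration meeting the SLA.
    configs: list of (name, hourly_cost, service_capacity) tuples.
    A simple M/M/1-style model predicts latency as 1 / (capacity - workload);
    a configuration whose capacity does not exceed the workload is saturated.
    Returns the chosen config name, or None if no configuration is feasible.
    """
    feasible = []
    for name, cost, capacity in configs:
        if capacity <= workload:
            continue  # saturated: latency grows without bound
        latency = 1.0 / (capacity - workload)
        if latency <= sla_latency:
            feasible.append((cost, name))
    return min(feasible)[1] if feasible else None

# Hypothetical exposed configurations: (name, $/hour, requests/sec capacity).
configs = [("small", 0.05, 50.0),
           ("medium", 0.10, 120.0),
           ("large", 0.20, 300.0)]
```

Running the decision periodically against fresh workload estimates yields the dynamic reconfiguration loop: scale up when the current tier violates the SLA, scale down when a cheaper tier still satisfies it.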
...