Publications
Adversarial Decomposition of Text Representation
TLDR
The proposed method for adversarial decomposition of text representation uses adversarial-motivational training and includes a special motivational loss, which acts in opposition to the discriminator and encourages a better decomposition.
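A minimal PyTorch sketch of the adversarial-motivational idea, under assumed sizes; every name here (Decomposer, motivator, D_HIDDEN, and so on) is hypothetical, not the paper's code. A discriminator tries to recover the form label from the meaning vector, while the motivational loss pushes the form vector to keep that information:

```python
import torch
import torch.nn as nn

# Hypothetical dimensions for illustration only; the paper's actual sizes differ.
D_HIDDEN, D_MEANING, D_FORM, N_FORMS = 256, 64, 16, 2

class Decomposer(nn.Module):
    """Splits a sentence encoding into separate meaning and form vectors."""
    def __init__(self):
        super().__init__()
        self.to_meaning = nn.Linear(D_HIDDEN, D_MEANING)
        self.to_form = nn.Linear(D_HIDDEN, D_FORM)

    def forward(self, h):
        return self.to_meaning(h), self.to_form(h)

decomposer = Decomposer()
discriminator = nn.Linear(D_MEANING, N_FORMS)  # adversary: recover form from meaning
motivator = nn.Linear(D_FORM, N_FORMS)         # motivator: reward form vector for form info
ce = nn.CrossEntropyLoss()

h = torch.randn(8, D_HIDDEN)                   # stand-in for sentence encodings
form_labels = torch.randint(0, N_FORMS, (8,))

meaning, form = decomposer(h)
adversarial_loss = -ce(discriminator(meaning), form_labels)  # fool the discriminator
motivational_loss = ce(motivator(form), form_labels)         # keep form recoverable
encoder_loss = adversarial_loss + motivational_loss          # discriminator is trained separately
```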
Adversarial Text Generation Without Reinforcement Learning
TLDR
Both human ratings and BLEU scores show that the model generates realistic text relative to competitive baselines, and visualizations of sentence vectors indicate that the model correctly learns the latent space of the autoencoder.
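The summary implies a GAN trained over an autoencoder's continuous latent space, so discrete-token gradients (and hence reinforcement learning) are never needed. A minimal sketch under that assumption; the sizes and module names are illustrative, not the paper's:

```python
import torch
import torch.nn as nn

D_LATENT = 100  # hypothetical size of the autoencoder's sentence-vector space

# Generator maps noise to fake sentence vectors; the critic scores latent vectors.
generator = nn.Sequential(nn.Linear(D_LATENT, 256), nn.ReLU(), nn.Linear(256, D_LATENT))
critic = nn.Sequential(nn.Linear(D_LATENT, 256), nn.ReLU(), nn.Linear(256, 1))

real = torch.randn(32, D_LATENT)             # stand-in for encodings of real sentences
fake = generator(torch.randn(32, D_LATENT))  # fake sentence vectors from noise

# The GAN operates on continuous vectors, decoded to text by the autoencoder
# afterwards, so gradients flow directly and no RL objective is required.
critic_loss = -(critic(real).mean() - critic(fake.detach()).mean())
generator_loss = -critic(fake).mean()
```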
HumorHawk at SemEval-2017 Task 6: Mixing Meaning and Sound for Humor Recognition
TLDR
This paper describes the winning system for SemEval-2017 Task 6: #HashtagWars: Learning a Sense of Humor, which utilizes recurrent deep learning methods with dense embeddings to predict humorous tweets from the @midnight show #HashtagWars.
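The task compares tweets pairwise for humor, so one plausible reading of "recurrent deep learning methods with dense embeddings" is an LSTM encoder over each tweet followed by a pairwise classifier. A sketch under that assumption only; all identifiers here (TweetEncoder, VOCAB, etc.) are hypothetical:

```python
import torch
import torch.nn as nn

VOCAB, D_EMB, D_HID = 10_000, 100, 128  # hypothetical vocabulary and layer sizes

class TweetEncoder(nn.Module):
    """Embeds a tweet and summarizes it with an LSTM's final hidden state."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, D_EMB)
        self.lstm = nn.LSTM(D_EMB, D_HID, batch_first=True)

    def forward(self, ids):
        _, (h, _) = self.lstm(self.emb(ids))
        return h[-1]

encoder = TweetEncoder()
classifier = nn.Linear(2 * D_HID, 2)        # predicts which tweet of a pair is funnier

tweet_a = torch.randint(0, VOCAB, (4, 20))  # stand-in token ids, batch of 4 pairs
tweet_b = torch.randint(0, VOCAB, (4, 20))
logits = classifier(torch.cat([encoder(tweet_a), encoder(tweet_b)], dim=-1))
```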
Injecting Hierarchy with U-Net Transformers
TLDR
This work proposes a novel architecture that combines ideas from Transformer and U-Net models to incorporate hierarchy at multiple levels of abstraction, and empirically demonstrates that the proposed architecture outperforms the vanilla Transformer and strong baselines in the chit-chat dialogue and machine translation domains.
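The U-Net idea, transplanted to sequences, is to pool the sequence to a coarser resolution, process it, then upsample and merge via a skip connection. A minimal PyTorch sketch of that pattern; dimensions are assumed, and weights are shared across levels here only for brevity:

```python
import torch
import torch.nn as nn

D = 64  # hypothetical model dimension

enc = nn.TransformerEncoderLayer(D, nhead=4, batch_first=True)
down = nn.AvgPool1d(2)            # halve sequence length: coarser abstraction
up = nn.Upsample(scale_factor=2)  # restore the original sequence length
dec = nn.TransformerEncoderLayer(D, nhead=4, batch_first=True)

x = torch.randn(2, 16, D)                             # (batch, seq, dim)
fine = enc(x)                                         # fine-grained level
coarse = down(fine.transpose(1, 2)).transpose(1, 2)   # pool along the sequence axis
coarse = enc(coarse)                                  # process at the coarse level
restored = up(coarse.transpose(1, 2)).transpose(1, 2)
out = dec(restored + fine)                            # U-Net-style skip connection
```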
Memory-Augmented Recurrent Networks for Dialogue Coherence
TLDR
This work introduces two separate dialogue architectures based on the Neural Turing Machine (NTM), and replaces the sequence-to-sequence architecture with a neural language model to give the NTM a longer context and a greater understanding of the dialogue history.
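For readers unfamiliar with the NTM, its core operations are a content-addressed read and an erase-then-add write over an external memory, which is what lets dialogue context persist across turns. A toy sketch of those two operations, with assumed slot counts and no learned parameters:

```python
import torch
import torch.nn.functional as F

# Hypothetical memory: N_SLOTS rows, each of width M.
N_SLOTS, M = 128, 32
memory = torch.randn(N_SLOTS, M)

def read(memory, key):
    """Content-based addressing: attend over slots by cosine similarity."""
    w = F.softmax(F.cosine_similarity(memory, key.unsqueeze(0), dim=-1), dim=0)
    return w @ memory  # weighted sum of memory rows

def write(memory, key, erase, add):
    """NTM-style write: erase, then add, at content-addressed locations."""
    w = F.softmax(F.cosine_similarity(memory, key.unsqueeze(0), dim=-1), dim=0)
    return memory * (1 - torch.outer(w, erase)) + torch.outer(w, add)

key = torch.randn(M)
context = read(memory, key)   # retrieve stored dialogue context
memory = write(memory, key, torch.sigmoid(torch.randn(M)), torch.randn(M))
```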
Label-Conditioned Next-Frame Video Generation with Neural Flows
TLDR
This work proposes using Glow, a state-of-the-art neural flow generator, to generate videos conditioned on a textual label, one frame at a time; the proposed model is evaluated by computing cross entropy on a held-out validation set of videos, comparing multiple versions of the model in an ablation study.
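Cross entropy on held-out frames is well defined for a flow like Glow because the change-of-variables formula gives an exact log-likelihood: log p(x) = log p(z) + log|det J|, usually reported as bits per dimension. A toy sketch with a single affine map standing in for Glow's coupling layers; the function and parameters are illustrative only:

```python
import math
import torch

def frame_log_prob(x, scale, shift):
    """Toy affine flow z = x*scale + shift, a stand-in for Glow's coupling layers."""
    z = x * scale + shift
    log_det = torch.log(scale.abs()).sum()                     # change-of-variables term
    log_pz = (-0.5 * (z ** 2 + math.log(2 * math.pi))).sum()   # standard-normal prior
    return log_pz + log_det

x = torch.randn(3, 32, 32)                      # one held-out frame (C, H, W)
scale = torch.rand(3, 32, 32) + 0.5             # hypothetical learned parameters
shift = torch.zeros(3, 32, 32)
nll = -frame_log_prob(x, scale, shift)          # cross entropy in nats
bits_per_dim = nll / (x.numel() * math.log(2))  # the usual flow evaluation metric
```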