Publications
Plug and Play Autoencoders for Conditional Text Generation
TLDR
Evaluations on style transfer tasks, both with and without sequence-to-sequence supervision, show that the proposed plug-and-play Emb2Emb method performs better than or comparably to strong baselines while being up to four times faster.
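The plug-and-play idea named here is to train only a small mapping in the latent space of a fixed autoencoder, leaving the autoencoder untouched. Below is a minimal PyTorch sketch under that reading; the frozen encoder, the MLP mapping, and the embedding-space MSE objective are illustrative assumptions, not the paper's exact components or loss.

```python
# Minimal sketch of a plug-and-play latent mapping: only `mapping` is
# trained; the autoencoder's encoder (and decoder, at inference) stay frozen.
# The loss below is a simple stand-in, not the paper's actual objective.
import torch
import torch.nn as nn

class LatentMapping(nn.Module):
    """Maps an input sentence embedding to an output sentence embedding."""
    def __init__(self, dim: int, hidden: int = 512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)

def train_step(encoder, mapping, optimizer, x, y):
    """One step: gradients flow only through `mapping`."""
    with torch.no_grad():          # frozen pretrained encoder
        z_in = encoder(x)
        z_target = encoder(y)
    loss = nn.functional.mse_loss(mapping(z_in), z_target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

At inference, generation would run the mapped embedding through the autoencoder's frozen decoder, e.g. `decoder(mapping(encoder(x)))`.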
Sentence Bottleneck Autoencoders from Transformer Language Models
TLDR
A sentence-level autoencoder is constructed from a pretrained, frozen transformer language model. It achieves better quality than previous methods that extract representations from pretrained transformers, as measured on text similarity, style transfer, and single-sentence classification tasks in the GLUE benchmark, while using fewer parameters than large pretrained models.
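The construction described pools a frozen transformer's token states into a single sentence vector. The sketch below illustrates that general recipe with a generic Hugging Face model and a simple learned attention pooling; the model name and pooling choice are assumptions, and the paper's bottleneck and decoder details may differ.

```python
# Hedged sketch: freeze a pretrained transformer and learn only a pooling
# layer that compresses its token states into one sentence embedding.
# A small trainable decoder (not shown) would reconstruct text from it.
import torch
import torch.nn as nn
from transformers import AutoModel

class SentenceBottleneck(nn.Module):
    def __init__(self, name: str = "roberta-base"):  # illustrative choice
        super().__init__()
        self.encoder = AutoModel.from_pretrained(name)
        for p in self.encoder.parameters():          # freeze the LM
            p.requires_grad = False
        dim = self.encoder.config.hidden_size
        self.pool = nn.Linear(dim, 1)                # learned attention scores

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(
            input_ids, attention_mask=attention_mask
        ).last_hidden_state                          # (batch, seq, dim)
        scores = self.pool(hidden).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = scores.softmax(dim=-1).unsqueeze(-1)
        return (weights * hidden).sum(dim=1)         # one vector per sentence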
Pivot Through English: Reliably Answering Multilingual Questions without Document Retrieval
TLDR
Circumventing retrieval, this approach offers rapid, off-the-shelf answer generation in almost any language, without requiring additional training data in the target language.
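A common instantiation of pivoting through English is a translate–answer–translate pipeline. The sketch below shows that generic pattern, not necessarily the paper's exact method; `translate` and `answer_in_english` are hypothetical callables standing in for an MT system and an English QA model.

```python
# Generic English-pivot pipeline (hypothetical components): translate the
# question into English, answer it there, translate the answer back.
from typing import Callable

def pivot_answer(
    question: str,
    source_lang: str,
    translate: Callable[[str, str, str], str],   # (text, src, tgt) -> text
    answer_in_english: Callable[[str], str],     # English QA model
) -> str:
    english_question = translate(question, source_lang, "en")
    english_answer = answer_in_english(english_question)
    return translate(english_answer, "en", source_lang)
```

Because the English QA model does all the answering, no retrieval index or target-language training data is needed, which is what makes the approach fast and broadly applicable off-the-shelf.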