Corpus ID: 237263305

A Recipe For Arbitrary Text Style Transfer with Large Language Models

@article{Reif2021ARF,
  title={A Recipe For Arbitrary Text Style Transfer with Large Language Models},
  author={Emily Reif and Daphne Ippolito and Ann Yuan and Andy Coenen and Chris Callison-Burch and Jason Wei},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.03910}
}
In this paper, we leverage large language models (LMs) to perform zero-shot text style transfer. We present a prompting method that we call augmented zero-shot learning, which frames style transfer as a sentence rewriting task and requires only a natural language instruction, without model fine-tuning or exemplars in the target style. Augmented zero-shot learning is simple and demonstrates promising results not just on standard style transfer tasks such…
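The prompting recipe described in the abstract lends itself to a short sketch. The exemplar rewrites and exact template wording below are illustrative assumptions rather than the authors' verbatim prompt, and generate_with_lm is a hypothetical placeholder for whatever large language model API is available; the point is only that the prompt pairs arbitrary rewriting instructions with example rewrites and then appends the actual request.

```python
# Minimal sketch of an augmented zero-shot style-transfer prompt.
# The exemplars and template wording are assumptions for illustration,
# not the paper's verbatim prompt.

# Non-parallel exemplars: arbitrary rewriting instructions paired with
# example rewrites; none of them need to be in the target style.
EXEMPLARS = [
    ("more melodramatic",
     "The cat sat on the mat.",
     "Alas, the weary cat collapsed, defeated, upon the humble mat."),
    ("more positive",
     "That dress looks terrible on you.",
     "That dress looks wonderful on you."),
]

def build_prompt(input_text: str, target_style: str) -> str:
    """Frame style transfer as sentence rewriting guided by a natural language instruction."""
    parts = []
    for style, source, rewrite in EXEMPLARS:
        parts.append(
            f"Here is some text: {{{source}}} Here is a rewrite of the text, "
            f"which is {style}: {{{rewrite}}}"
        )
    # The final, unanswered entry carries the actual request; the model is
    # expected to complete the rewrite inside the braces.
    parts.append(
        f"Here is some text: {{{input_text}}} Here is a rewrite of the text, "
        f"which is {target_style}: {{"
    )
    return "\n\n".join(parts)

if __name__ == "__main__":
    prompt = build_prompt("I had a bad day at work.", "more cheerful")
    print(prompt)
    # completion = generate_with_lm(prompt)  # hypothetical LM call
```

Because the exemplars cover unrelated styles, no target-style data or fine-tuning is needed; only the final natural language instruction changes per request.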
Few-shot Controllable Style Transfer for Low-Resource Settings: A Study in Indian Languages
TLDR
This work pushes the state-of-the-art for few-shot style transfer with a new method modeling the stylistic difference between paraphrases, and achieves 2-3x better performance and output diversity in formality transfer and code-mixing addition across five Indian languages.
From Theories on Styles to their Transfer in Text: Bridging the Gap with a Hierarchical Survey
TLDR
A comprehensive discussion of the styles that have received attention in the transfer task is provided, organized into a hierarchy, highlighting the challenges for the definition of each of them, and pointing out gaps in the current research landscape.
Bag-of-Vectors Autoencoders for Unsupervised Conditional Text Generation
TLDR
This work extends Emb2Emb to Bag-of-Vectors Autoencoders (BoV-AEs), which encode the text into a variable-size bag of vectors that grows with the size of the text, as in attention-based models, and proposes regularization techniques that facilitate learning meaningful operations in the latent space.
AI Chains: Transparent and Controllable Human-AI Interaction by Chaining Large Language Model Prompts
TLDR
This work defines a set of LLM primitive operations useful for Chain construction, then presents an interactive system where users can modify these Chains, along with their intermediate results, in a modular way, and explores how LLM Chains may be used in future applications.
Finetuned Language Models Are Zero-Shot Learners
TLDR
It is shown that instruction tuning—finetuning language models on a collection of datasets described via instructions—substantially boosts zero-shot performance on unseen tasks and improves the zero-shot learning abilities of language models.
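As a hypothetical illustration of what "datasets described via instructions" can look like (these are not the paper's actual templates), a labeled example can be rewritten as a natural language instruction plus target answer before fine-tuning:

```python
# Hypothetical instruction-tuning data formatting: a labeled NLI example is
# rewritten as an instruction/target pair so that fine-tuning on many such
# datasets can improve zero-shot performance on unseen tasks.

def to_instruction_example(premise: str, hypothesis: str, label: str) -> dict:
    """Convert one NLI example into an (input instruction, target) pair."""
    instruction = (
        f"Premise: {premise}\n"
        f"Hypothesis: {hypothesis}\n"
        "Does the premise entail the hypothesis? Answer yes, no, or maybe."
    )
    return {"input": instruction, "target": label}

example = to_instruction_example(
    premise="A man is playing a guitar on stage.",
    hypothesis="A musician is performing.",
    label="yes",
)
print(example)
```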

References

SHOWING 1-10 OF 47 REFERENCES
“Transforming” Delete, Retrieve, Generate Approach for Controlled Text Style Transfer
TLDR
This work introduces the Generative Style Transformer (GST), a new approach to rewriting sentences to a target style in the absence of parallel style corpora, which outperforms state-of-the-art systems across 5 datasets on sentiment, gender and political slant transfer.
Reformulating Unsupervised Style Transfer as Paraphrase Generation
TLDR
This paper reformulates unsupervised style transfer as a paraphrase generation problem, and presents a simple methodology based on fine-tuning pretrained language models on automatically generated paraphrase data that significantly outperforms state-of-the-art style transfer systems on both human and automatic evaluations.
TextSETTR: Label-Free Text Style Extraction and Tunable Targeted Restyling
TLDR
This work shows that T5 (Raffel et al., 2020), a strong pretrained text-to-text model, can be adapted to extract a style vector from arbitrary text and use this vector to condition the decoder to perform style transfer, and recasts transfer as "targeted restyling" vector operations that adjust specific attributes of the input text while preserving others.
Dear Sir or Madam, May I Introduce the GYAFC Dataset: Corpus, Benchmarks and Metrics for Formality Style Transfer
TLDR
This work creates the largest corpus for a particular stylistic transfer (formality) and shows that techniques from the machine translation community can serve as strong baselines for future work.
Style Transfer Through Back-Translation
TLDR
A latent representation of the input sentence is learned which is grounded in a language translation model in order to better preserve the meaning of the sentence while reducing stylistic properties, and adversarial generation techniques are used to make the output match the desired style.
Zero-shot Text Classification With Generative Language Models
TLDR
This work investigates the use of natural language to enable zero-shot model adaptation to new tasks, using text and metadata from social commenting platforms as a source for a simple pretraining task and shows that natural language can serve as simple and powerful descriptors for task adaptation.
Style Transfer from Non-Parallel Text by Cross-Alignment
TLDR
This paper proposes a method that leverages refined alignment of latent representations to perform style transfer on the basis of non-parallel text, and demonstrates the effectiveness of this cross-alignment method on three tasks: sentiment modification, decipherment of word substitution ciphers, and recovery of word order.
Style Transfer in Text: Exploration and Evaluation
TLDR
Two models are explored to learn style transfer with non-parallel data, learning separate content and style representations using adversarial networks, and novel evaluation metrics are proposed that measure two aspects of style transfer: transfer strength and content preservation.
Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation
TLDR
The Style Transformer is proposed, which makes no assumption about the latent representation of the source sentence and leverages the attention mechanism in the Transformer to achieve better style transfer and better content preservation.
A Dual Reinforcement Learning Framework for Unsupervised Text Style Transfer
Unsupervised text style transfer aims to transfer the underlying style of text but keep its main content unchanged without parallel data. Most existing methods typically follow two steps: first…