CREATER: CTR-driven Advertising Text Generation with Controlled Pre-Training and Contrastive Fine-Tuning

@article{Wei2022CREATERCA,
  title={CREATER: CTR-driven Advertising Text Generation with Controlled Pre-Training and Contrastive Fine-Tuning},
  author={Penghui Wei and Xuanhua Yang and Shaoguo Liu and Liang Wang and Bo Zheng},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.08943}
}
This paper focuses on automatically generating ad text, with the goal that the generated text captures user interest and achieves a higher click-through rate (CTR). We propose CREATER, a CTR-driven advertising text generation approach that generates ad texts based on high-quality user reviews. To incorporate the CTR objective, our model learns from online A/B test data with contrastive learning, which encourages the model to generate ad texts that obtain higher CTR. To make use of large…
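The excerpt above does not show CREATER's exact objective, but the contrastive idea it describes can be sketched as a pairwise ranking loss over A/B-tested ad pairs: for the same input reviews, the text that won the online A/B test (higher CTR) should score higher under the model than the text that lost. The sketch below is illustrative only; the helper names, the margin formulation, and the model interface are assumptions, not CREATER's actual API.

import torch
import torch.nn.functional as F

def sequence_log_prob(model, src_ids, tgt_ids):
    """Sum of token log-probabilities of tgt_ids given src_ids.

    Assumes `model` is an encoder-decoder returning logits of shape
    (batch, tgt_len - 1, vocab) under teacher forcing; this helper is
    illustrative, not CREATER's interface.
    """
    logits = model(src_ids, tgt_ids[:, :-1])
    log_probs = F.log_softmax(logits, dim=-1)
    tok_lp = log_probs.gather(-1, tgt_ids[:, 1:].unsqueeze(-1)).squeeze(-1)
    return tok_lp.sum(dim=-1)                            # (batch,)

def ctr_contrastive_loss(model, src_ids, winner_ids, loser_ids, margin=1.0):
    """Margin ranking loss: the higher-CTR ad from an online A/B test
    should be more likely under the model than the lower-CTR ad."""
    s_win = sequence_log_prob(model, src_ids, winner_ids)
    s_lose = sequence_log_prob(model, src_ids, loser_ids)
    return F.relu(margin - (s_win - s_lose)).mean()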


References

Showing 10 of 20 references.
Ad Headline Generation using Self-Critical Masked Language Model
TLDR: Proposes a programmatic solution that generates product advertising headlines from retail content by jointly conditioning on multiple products a seller wishes to advertise, and shows that the method outperforms existing Transformer and LSTM+RL baselines on overlap metrics and in quality audits.
Evolutionary Product Description Generation: A Dynamic Fine-Tuning Approach Leveraging User Click Behavior
TLDR: Proposes an evolutionary neural text generation (NTG) model whose interactive environment fine-tunes a pre-trained generative policy via reinforcement learning (RL), deriving a dynamic notion of textual fitness from the user click behavior associated with previously generated content to estimate reward and penalty signals for each output text.
Quality-Sensitive Training! Social Advertisement Generation by Leveraging User Click Behavior
TLDR: Puts forward a novel seq2seq model that generates social advertisements automatically, with a quality-sensitive loss function based on user click behavior that differentiates training samples of varied quality while preserving the semantics of the original input as much as possible.
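The paper defines its own quality-sensitive loss; as a rough illustration of the general idea only, one common way to differentiate training samples by quality is to weight each example's cross-entropy by a click-derived score. The function and score normalization below are hypothetical, not the cited formulation.

import torch
import torch.nn.functional as F

def quality_weighted_nll(logits, targets, quality, pad_id=0):
    """Cross-entropy weighted per example by a click-derived quality score.

    logits:  (batch, seq_len, vocab)
    targets: (batch, seq_len) token ids
    quality: (batch,) scores in [0, 1], e.g. normalized historical CTR.
    """
    nll = F.cross_entropy(
        logits.transpose(1, 2), targets,
        ignore_index=pad_id, reduction="none")           # (batch, seq_len)
    per_example = nll.sum(dim=1)                         # sum over tokens
    return (quality * per_example).mean()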
Reinforcing Pretrained Models for Generating Attractive Text Advertisements
TLDR: Proposes a model-based reinforcement learning framework for text ad generation that models the environment dynamics to avoid large sample complexity, and develops Masked-Sequence Policy Gradient, an RL algorithm that integrates efficiently with pretrained models and explores the action space effectively.
Generating Better Search Engine Text Advertisements with Deep Reinforcement Learning
TLDR: Jointly trains a model to minimize cross-entropy on an existing corpus of landing-page/text-ad pairs using standard sequence-to-sequence training, while also optimizing the expected click-through rate, as predicted by an existing oracle model, via self-critical sequence training (SCST).
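SCST uses the reward of a greedily decoded sequence as a baseline for a sampled sequence, so no learned value function is needed. The sketch below shows the mixed objective the reference describes, with an oracle CTR predictor standing in as the reward; all names and the mixing weight alpha are illustrative assumptions.

import torch

def scst_loss(sample_log_prob, sample_reward, greedy_reward):
    """Self-critical sequence training step.

    sample_log_prob: (batch,) log p(sampled ad | landing page)
    sample_reward / greedy_reward: (batch,) CTR predicted by an oracle
    model for the sampled and greedily decoded ads, respectively.
    """
    advantage = sample_reward - greedy_reward            # self-critical baseline
    return -(advantage.detach() * sample_log_prob).mean()

def mixed_objective(ce_loss, sample_log_prob, sample_reward,
                    greedy_reward, alpha=0.5):
    """Jointly minimize cross-entropy on reference ads and maximize
    expected predicted CTR, as the reference above describes."""
    rl = scst_loss(sample_log_prob, sample_reward, greedy_reward)
    return alpha * ce_loss + (1.0 - alpha) * rl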
Automated snippet generation for online advertising
TLDR: Presents a method that automatically produces compact text ads (promotional text snippets) from a product description webpage (landing page), yielding short yet comprehensive ads while maintaining relevance, clarity, and attractiveness.
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
TLDR: A systematic study comparing pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors across dozens of language understanding tasks; the resulting model achieves state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more.
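As a concrete illustration of T5's text-to-text framing, every task is expressed as a string with a task prefix and solved by the same encoder-decoder. This usage sketch assumes the HuggingFace transformers library (not part of the cited paper) is installed.

from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Every task is cast as text-to-text: a task prefix plus the input.
text = "summarize: The product received thousands of five-star reviews ..."
input_ids = tokenizer(text, return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_length=32, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))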
PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization
TLDR: Proposes PEGASUS, a new self-supervised objective for pre-training large Transformer-based encoder-decoder models on massive text corpora, and demonstrates state-of-the-art performance on all 12 downstream summarization datasets as measured by ROUGE.
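PEGASUS's objective (gap-sentence generation) removes whole sentences from a document and trains the decoder to reconstruct them. The data-preparation sketch below is simplified: it selects gap sentences at random, whereas PEGASUS selects important sentences (e.g. by ROUGE against the rest of the document); names and the mask token are illustrative.

import random

def gap_sentence_example(sentences, mask_token="<mask_1>", gap_ratio=0.3):
    """Build one PEGASUS-style (input, target) pair.

    Random selection here is a simplification; PEGASUS picks sentences
    deemed important to the document.
    """
    k = max(1, int(len(sentences) * gap_ratio))
    gap_idx = set(random.sample(range(len(sentences)), k))
    source = " ".join(mask_token if i in gap_idx else s
                      for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in sorted(gap_idx))
    return source, target

src, tgt = gap_sentence_example(
    ["Great battery life.", "The screen is dim.", "Shipping was fast."])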
Get To The Point: Summarization with Pointer-Generator Networks
TLDR: A novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways: a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information while retaining the ability to produce novel words through the generator, and a coverage mechanism that discourages repetition.
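The copy mechanism interpolates between generating from the vocabulary and copying source tokens via the attention distribution, following See et al. (2017): P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of attention on source positions holding w. A minimal sketch, with shapes annotated; the function name and interface are assumptions.

import torch

def pointer_generator_dist(vocab_dist, attn_dist, p_gen, src_ids):
    """Final output distribution of a pointer-generator network.

    vocab_dist: (batch, vocab_size) softmax over the generator vocabulary
    attn_dist:  (batch, src_len) attention over source tokens
    p_gen:      (batch, 1) generation probability from a learned gate
    src_ids:    (batch, src_len) source token ids (int64)
    """
    gen = p_gen * vocab_dist
    copy = (1.0 - p_gen) * attn_dist
    # Scatter-add copy probabilities onto the vocabulary ids they point at.
    return gen.scatter_add(1, src_ids, copy)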
Contrastive Learning with Adversarial Perturbations for Conditional Text Generation
TLDR: Proposes a principled method for generating positive and negative samples for contrastive learning of sequence-to-sequence models, and empirically shows that it significantly improves seq2seq generalization on three text generation tasks: machine translation, text summarization, and question generation.
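The cited method constructs positives and negatives by adversarially perturbing the target sequence (meaning-preserving vs. meaning-changing perturbations). The sketch below shows only a contrastive term of the kind such methods optimize, here an InfoNCE-style loss over pooled decoder representations, with the perturbation step abstracted away; all names are hypothetical.

import torch
import torch.nn.functional as F

def contrastive_seq2seq_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE over pooled decoder representations.

    anchor:    (batch, dim)  representation of the reference target
    positive:  (batch, dim)  meaning-preserving perturbation of the target
    negatives: (batch, n_neg, dim)  meaning-changing perturbations
    """
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    n = F.normalize(negatives, dim=-1)
    pos = (a * p).sum(-1, keepdim=True) / temperature     # (batch, 1)
    neg = torch.einsum("bd,bkd->bk", a, n) / temperature  # (batch, n_neg)
    logits = torch.cat([pos, neg], dim=1)
    labels = torch.zeros(a.size(0), dtype=torch.long)     # positive is index 0
    return F.cross_entropy(logits, labels)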