Pre-train and Plug-in: Flexible Conditional Text Generation with Variational Auto-Encoders

@inproceedings{Duan2020PretrainAP,
  title={Pre-train and Plug-in: Flexible Conditional Text Generation with Variational Auto-Encoders},
  author={Yu Duan and Jiaxin Pei and Canwen Xu and Chenliang Li},
  booktitle={ACL},
  year={2020}
}
  • Yu Duan, Jiaxin Pei, Canwen Xu, Chenliang Li
  • Published in ACL 2020
  • Computer Science, Mathematics
  • Conditional Text Generation has drawn much attention as a topic of Natural Language Generation (NLG), since it offers a way for humans to control the properties of generated content. Current conditional generation models cannot handle emerging conditions because of their joint, end-to-end learning fashion: when a new condition is added, these techniques require full retraining. In this paper, we present a new framework named Pre-train and Plug-in Variational Auto-Encoder (PPVAE) towards…
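
The core idea in the abstract, a condition-agnostic text VAE pre-trained once plus a small per-condition "plug-in" VAE trained on top of its frozen latent space, can be illustrated with a short sketch. The PyTorch code below is a minimal illustration under assumed class names, dimensions, and objectives; it is not the authors' released implementation.

```python
# Minimal sketch of the pre-train-and-plug-in idea (illustrative only).
import torch
import torch.nn as nn

class GlobalVAE(nn.Module):
    """Condition-agnostic text VAE, pre-trained once on a large corpus."""
    def __init__(self, vocab_size=10000, emb_dim=256, z_dim=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, z_dim, batch_first=True)
        self.to_mu = nn.Linear(z_dim, z_dim)
        self.to_logvar = nn.Linear(z_dim, z_dim)
        # The decoder (unused in the plug-in training below) maps latents
        # back to text at generation time.
        self.decoder = nn.GRU(emb_dim, z_dim, batch_first=True)
        self.out = nn.Linear(z_dim, vocab_size)

    def encode(self, tokens):                      # tokens: (B, T) int64
        _, h = self.encoder(self.emb(tokens))      # h: (1, B, z_dim)
        h = h.squeeze(0)
        return self.to_mu(h), self.to_logvar(h)

class PluginVAE(nn.Module):
    """Tiny per-condition VAE over the *frozen* global latent space."""
    def __init__(self, z_dim=128, zc_dim=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(z_dim, 64), nn.Tanh())
        self.to_mu = nn.Linear(64, zc_dim)
        self.to_logvar = nn.Linear(64, zc_dim)
        self.dec = nn.Sequential(nn.Linear(zc_dim, 64), nn.Tanh(),
                                 nn.Linear(64, z_dim))

    def forward(self, z):
        h = self.enc(z)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        zc = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparam. trick
        return self.dec(zc), mu, logvar

def train_plugin(global_vae, plugin, cond_tokens, epochs=10, lr=1e-3):
    """Handle a new condition by training only the small plug-in network;
    the global VAE stays frozen, so no full retraining is needed."""
    for p in global_vae.parameters():
        p.requires_grad_(False)
    opt = torch.optim.Adam(plugin.parameters(), lr=lr)
    for _ in range(epochs):
        with torch.no_grad():
            # Global latents of the condition-specific text sample.
            z, _ = global_vae.encode(cond_tokens)
        z_rec, mu, logvar = plugin(z)
        recon = ((z_rec - z) ** 2).sum(dim=-1).mean()
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=-1).mean()
        loss = recon + kl
        opt.zero_grad()
        loss.backward()
        opt.step()

# Usage with stand-in data (a real setup would pre-train GlobalVAE first):
gvae, plug = GlobalVAE(), PluginVAE()
train_plugin(gvae, plug, torch.randint(0, 10000, (8, 20)), epochs=1)
```

At generation time, in this sketch, one would sample z_c from a standard Gaussian, map it through the plug-in decoder into the global latent space, and let the frozen global decoder produce text for the new condition.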

