Adversarial Decomposition of Text Representation

@article{Romanov2019AdversarialDO,
  title={Adversarial Decomposition of Text Representation},
  author={Alexey Romanov and Anna Rumshisky and Anna Rogers and David Donahue},
  journal={ArXiv},
  year={2019},
  volume={abs/1808.09042}
}
In this paper, we present a method for adversarial decomposition of text representation. [...] The model uses adversarial-motivational training and includes a special motivational loss, which acts opposite to the discriminator and encourages a better decomposition. Furthermore, we evaluate the obtained meaning embeddings on a downstream task of paraphrase detection and show that they significantly outperform the embeddings of a regular autoencoder.
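The abstract's adversarial-motivational objective can be sketched as follows. This is a minimal illustrative toy, not the paper's exact formulation: the function names, the use of binary cross-entropy, and the weight `lambda_mot` are assumptions. The key idea shown is that the motivational loss is the discriminator's loss with the target label flipped, so the encoder is rewarded when the discriminator cannot recover the form from the meaning embedding.

```python
import math

def bce(p, y):
    """Binary cross-entropy for one predicted probability p against label y."""
    eps = 1e-12
    return -(y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps))

def discriminator_loss(p_form_from_meaning):
    # The discriminator tries to recover the form (true label = 1 here)
    # from the meaning embedding; it minimizes this loss.
    return bce(p_form_from_meaning, 1.0)

def motivational_loss(p_form_from_meaning):
    # "Acts opposite to the discriminator": same prediction, flipped
    # target, so the encoder is pushed to hide form from the meaning side.
    return bce(p_form_from_meaning, 0.0)

def encoder_objective(recon_loss, p_form_from_meaning, lambda_mot=0.1):
    # Encoder minimizes reconstruction error plus the weighted
    # motivational term (lambda_mot is an illustrative assumption).
    return recon_loss + lambda_mot * motivational_loss(p_form_from_meaning)
```

When the discriminator confidently recovers the form (`p` near 1), the motivational term grows and penalizes the encoder; when the form is hidden (`p` near 0), the encoder's objective approaches pure reconstruction.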
22 Citations

  • Multi-type Disentanglement without Adversarial Training
  • SummAE: Zero-Shot Abstractive Text Summarization using Length-Agnostic Auto-Encoders
  • Style Transfer for Texts: Retrain, Report Errors, Compare with Rewrites
  • Deep Learning for Text Attribute Transfer: A Survey
  • Attribute Alignment: Controlling Text Generation from Pre-trained Language Models
  • Style Transfer for Texts: to Err is Human, but Error Margins Matter
  • Finding Experts in Transformer Models
  • PowerTransformer: Unsupervised Controllable Revision for Biased Language Correction
  • Decomposing Textual Information For Style Transfer
