Improving Disentangled Text Representation Learning with Information-Theoretic Guidance

@article{Cheng2020ImprovingDT,
  title={Improving Disentangled Text Representation Learning with Information-Theoretic Guidance},
  author={Pengyu Cheng and Martin Renqiang Min and Dinghan Shen and Christopher Malon and Yizhe Zhang and Yitong Li and L. Carin},
  journal={arXiv preprint arXiv:2006.00693},
  year={2020}
}
Learning disentangled representations of natural language is essential for many NLP tasks, e.g., conditional text generation, style transfer, and personalized dialogue systems. Similar problems have been studied extensively for other forms of data, such as images and videos. However, the discrete nature of natural language makes disentangling textual representations more challenging (e.g., manipulations over the data space cannot be easily achieved). Inspired by information theory…
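The truncated abstract alludes to information-theoretic guidance for disentangling. One common realization of this idea (not necessarily this paper's exact formulation) is to minimize a sample-based upper bound on the mutual information between the style and content embeddings of the same sentence, in the spirit of the CLUB estimator cited below. The sketch assumes a Gaussian variational approximation q(content | style); `mu_fn` and `log_var_fn` stand in for a trained network and are hypothetical names.

```python
import numpy as np

def club_mi_upper_bound(style, content, mu_fn, log_var_fn):
    """Sample-based CLUB-style upper bound on I(style; content).

    style, content: (N, d) arrays of paired embeddings.
    mu_fn, log_var_fn: placeholders for a trained network that
    parameterizes a Gaussian q(content | style).
    """
    mu = mu_fn(style)            # (N, d) predicted mean of q(c|s)
    log_var = log_var_fn(style)  # (N, d) predicted log-variance

    # log q(c_i | s_i) for matched (positive) pairs, up to a constant
    pos = -0.5 * (((content - mu) ** 2) / np.exp(log_var) + log_var)

    # log q(c_j | s_i) over all mismatched (negative) pairs
    diff = content[None, :, :] - mu[:, None, :]            # (N, N, d)
    neg = -0.5 * ((diff ** 2) / np.exp(log_var)[:, None, :]
                  + log_var[:, None, :])

    # CLUB bound: E_p(s,c)[log q(c|s)] - E_p(s)p(c)[log q(c|s)]
    return pos.sum(-1).mean() - neg.sum(-1).mean()
```

Minimizing this quantity as an auxiliary loss pushes the two embeddings toward independence; when style and content carry overlapping information, the estimate is large and positive, and it falls toward zero as the embeddings decorrelate.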
10 Citations
Semi-Disentangled Representation Learning in Recommendation System
FairFil: Contrastive Neural Debiasing Method for Pretrained Text Encoders
Improving Adversarial Text Generation by Modeling the Distant Future
Improving Zero-shot Voice Style Transfer via Disentangled Representation Learning
Disentangled Representation Learning
Disentangling semantics in language through VAEs and a certain architectural choice
FIND: Human-in-the-Loop Debugging Deep Text Classifiers
Estimating Total Correlation with Mutual Information Bounds
A Novel Estimator of Mutual Information for Learning to Disentangle Textual Representations
CLUB: A Contrastive Log-ratio Upper Bound of Mutual Information
