Corpus ID: 1696516

Improved Variational Autoencoders for Text Modeling using Dilated Convolutions

@inproceedings{Yang2017ImprovedVA,
  title={Improved Variational Autoencoders for Text Modeling using Dilated Convolutions},
  author={Zichao Yang and Zhiting Hu and Ruslan Salakhutdinov and Taylor Berg-Kirkpatrick},
  booktitle={ICML},
  year={2017}
}
  • Zichao Yang, Zhiting Hu, Ruslan Salakhutdinov, Taylor Berg-Kirkpatrick
  • Published in ICML 2017
  • Computer Science
  • Recent work on generative modeling of text has found that variational auto-encoders (VAEs) incorporating LSTM decoders perform worse than simpler LSTM language models (Bowman et al., 2015). This negative result is so far poorly understood, but it has been attributed to the propensity of LSTM decoders to ignore conditioning information from the encoder. In this paper, we experiment with a new type of decoder for the VAE: a dilated CNN. By changing the decoder's dilation architecture, we control the…
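The abstract's "control the decoder's dilation architecture" refers to a standard property of dilated causal convolutions: doubling the dilation per layer grows the decoder's receptive field exponentially with depth, so the architecture tunes how much past context the decoder can condition on. This is a minimal illustrative sketch of that property, not code from the paper; all function names are hypothetical:

```python
def causal_dilated_conv1d(x, w, dilation):
    """Causal 1-D convolution: output at step t depends only on x[0..t].

    x: input sequence (list of floats), w: kernel taps, dilation: gap
    between taps. Left-padding with zeros prevents leakage from the future.
    """
    k = len(w)
    pad = (k - 1) * dilation
    xp = [0.0] * pad + list(x)
    return [sum(w[j] * xp[t + j * dilation] for j in range(k))
            for t in range(len(x))]

def receptive_field(kernel_size, dilations):
    """How many past tokens a stack of dilated causal conv layers sees."""
    return 1 + sum((kernel_size - 1) * d for d in dilations)

# Doubling dilations grow the receptive field exponentially with depth;
# a plain CNN (all dilations 1) grows it only linearly.
print(receptive_field(3, [1, 2, 4, 8]))   # → 31 tokens of context
print(receptive_field(3, [1, 1, 1, 1]))   # → 9 tokens of context
```

Shrinking the receptive field forces the decoder to rely more on the latent code rather than modeling everything autoregressively, which is the trade-off the paper studies.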


    Citations

    Publications citing this paper, showing 1-10 of 138 citations:

    • Semi-Amortized Variational Autoencoders (9 excerpts; cites methods & background; highly influenced)
    • Trajectory-User Linking via Variational AutoEncoder (9 excerpts; cites background & methods; highly influenced)
    • Deconvolutional Latent-Variable Model for Text Sequence Matching (9 excerpts; cites methods & background; highly influenced)
    • A Stable Variational Autoencoder for Text Modelling (7 excerpts; cites methods & background; highly influenced)
    • Enhancing Variational Autoencoders with Mutual Information Neural Estimation for Text Generation (6 excerpts; cites background & methods; highly influenced)
    • Variational Pretraining for Semi-supervised Text Classification (13 excerpts; cites background; highly influenced)
    • Diversity regularized autoencoders for text generation (4 excerpts; cites methods & background; highly influenced)


    CITATION STATISTICS

    • 28 highly influenced citations
    • Averaged 43 citations per year from 2017 through 2019
    • 67% increase in citations per year in 2019 over 2018

    References

    Publications referenced by this paper, showing 1-10 of 36 references:

    • Generating Sentences from a Continuous Space (13 excerpts; highly influential)
    • Conditional Image Generation with PixelCNN Decoders (4 excerpts; highly influential)
    • Neural Machine Translation in Linear Time (5 excerpts; highly influential)
    • Video Pixel Networks (5 excerpts; highly influential)
    • WaveNet: A Generative Model for Raw Audio (4 excerpts; highly influential)
    • Neural Variational Inference for Text Processing (4 excerpts; highly influential)
    • Adam: A Method for Stochastic Optimization (4 excerpts; highly influential)
    • Categorical Reparameterization with Gumbel-Softmax (1 excerpt)
    • Deep Residual Learning for Image Recognition (3 excerpts)