Corpus ID: 203626686

Wasserstein Neural Processes

@article{Carr2019WassersteinNP,
  title={Wasserstein Neural Processes},
  author={A. Carr and Jared Nielson and D. Wingate},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.00668}
}
  • Published 2019
  • Computer Science, Mathematics
  • ArXiv
  • Neural Processes (NPs) are a class of models that learn a mapping from a context set of input-output pairs to a distribution over functions. They are traditionally trained using maximum likelihood with a KL divergence regularization term. We show that there are desirable classes of problems where NPs, with this loss, fail to learn any reasonable distribution. We also show that this drawback is solved by using approximations of the Wasserstein distance, which computes optimal transport distances…
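
For intuition, below is a minimal sketch (not the authors' implementation) of the sliced Wasserstein distance, one common optimal-transport approximation of the kind the abstract alludes to. The function name, the projection count, and the equal-sample-size assumption are all illustrative choices, not details taken from the paper:

```python
import numpy as np

def sliced_wasserstein(x, y, n_projections=100, rng=None):
    """Approximate the 2-Wasserstein distance between two point clouds
    x and y of shape (n, d) by averaging 1-D optimal transport costs
    over random projection directions.

    Assumes x and y contain the same number of samples.
    """
    rng = np.random.default_rng(rng)
    d = x.shape[1]
    # Draw random unit vectors (projection directions) on the d-sphere.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both sample sets onto every direction: shape (n, n_projections).
    xp = x @ theta.T
    yp = y @ theta.T
    # In one dimension, optimal transport simply matches sorted samples.
    xp = np.sort(xp, axis=0)
    yp = np.sort(yp, axis=0)
    return np.sqrt(np.mean((xp - yp) ** 2))

# Two well-separated Gaussians: a KL-based objective degenerates when the
# supports barely overlap, but this distance stays finite and informative.
x = np.random.default_rng(0).normal(0.0, 1.0, size=(512, 2))
y = np.random.default_rng(1).normal(5.0, 1.0, size=(512, 2))
print(sliced_wasserstein(x, y))  # roughly the projected mean separation
```

This finiteness on (near-)disjoint supports is precisely the property that distinguishes optimal transport distances from KL divergence, which is the failure mode of the standard NP loss that the paper targets.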
