A Gaussian Process Regression Model for Distribution Inputs

@article{Bachoc2018AGP,
  title={A Gaussian Process Regression Model for Distribution Inputs},
  author={F. Bachoc and F. Gamboa and Jean-Michel Loubes and N. Venet},
  journal={IEEE Transactions on Information Theory},
  year={2018},
  volume={64},
  pages={6620-6637}
}
Monge-Kantorovich distances, otherwise known as Wasserstein distances, have received growing attention in statistics and machine learning as a powerful discrepancy measure for probability distributions. In this paper, we focus on forecasting a Gaussian process indexed by probability distributions. For this, we provide a family of positive definite kernels built using transportation-based distances. We provide a probabilistic understanding of these kernels and characterize the corresponding…
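
As a rough illustration of the setting described in the abstract, the sketch below builds a Gaussian-type kernel on the squared 1-D Wasserstein-2 distance between empirical distributions and uses it for standard Gaussian process prediction. This is not the authors' code: the function names (w2_squared, wasserstein_rbf_kernel, gp_predict), the hyperparameter values, and the toy data are illustrative assumptions, chosen only to mirror the idea of a positive definite kernel built from a transportation distance.

# Minimal sketch (assumptions, not the paper's implementation): GP regression
# where each input is an empirical 1-D distribution and the kernel is a
# squared-exponential function of the Wasserstein-2 distance.
import numpy as np

def w2_squared(x, y, n_levels=200):
    """Approximate squared Wasserstein-2 distance between two empirical 1-D
    distributions given by samples x and y, via their quantile functions."""
    t = (np.arange(n_levels) + 0.5) / n_levels        # quantile levels in (0, 1)
    qx = np.quantile(x, t)
    qy = np.quantile(y, t)
    return float(np.mean((qx - qy) ** 2))             # approximates the integral of (F_x^{-1} - F_y^{-1})^2

def wasserstein_rbf_kernel(dists_a, dists_b, lengthscale=1.0, variance=1.0):
    """Kernel matrix K[i, j] = variance * exp(-W_2^2(mu_i, nu_j) / (2 * lengthscale^2))."""
    K = np.empty((len(dists_a), len(dists_b)))
    for i, a in enumerate(dists_a):
        for j, b in enumerate(dists_b):
            K[i, j] = np.exp(-w2_squared(a, b) / (2.0 * lengthscale ** 2))
    return variance * K

def gp_predict(train_dists, y, test_dists, noise=1e-2, **kernel_kwargs):
    """Standard GP posterior mean and variance, with distributions as inputs."""
    K = wasserstein_rbf_kernel(train_dists, train_dists, **kernel_kwargs)
    K += noise * np.eye(len(train_dists))
    Ks = wasserstein_rbf_kernel(test_dists, train_dists, **kernel_kwargs)
    Kss = wasserstein_rbf_kernel(test_dists, test_dists, **kernel_kwargs)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: each input is a sample; the response depends on the distribution's mean.
    train = [rng.normal(loc=m, scale=1.0, size=500) for m in np.linspace(-2, 2, 15)]
    y = np.array([np.mean(s) ** 2 for s in train])
    test = [rng.normal(loc=m, scale=1.0, size=500) for m in np.linspace(-1.5, 1.5, 5)]
    mean, var = gp_predict(train, y, test, noise=1e-2, lengthscale=1.0)
    print(np.round(mean, 2))

The quantile-based formula for the Wasserstein-2 distance used here is specific to distributions on the real line, which is also the one-dimensional setting in which kernels of this transportation-based form are commonly taken to be positive definite.
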
