Approximate Bayesian Computation with the Sliced-Wasserstein Distance
@article{Nadjahi2019ApproximateBC,
  title={Approximate Bayesian Computation with the Sliced-Wasserstein Distance},
  author={Kimia Nadjahi and Valentin De Bortoli and Alain Durmus and Roland Badeau and Umut Simsekli},
  journal={ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  year={2019},
  pages={5470-5474}
}
Approximate Bayesian Computation (ABC) is a popular method for approximate inference in generative models with intractable but easy-to-sample likelihood. It constructs an approximate posterior distribution by finding parameters for which the simulated data are close to the observations in terms of summary statistics. These statistics are defined beforehand and might induce a loss of information, which has been shown to deteriorate the quality of the approximation. To overcome this problem…
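The abstract's core idea is to drop summary statistics and compare the empirical distributions of observed and simulated data directly, using the Sliced-Wasserstein (SW) distance as the discrepancy. Below is a minimal sketch of an ABC rejection sampler built around a Monte Carlo SW estimator; `prior_sampler`, `simulator`, the tolerance `eps`, and the number of projections are illustrative placeholders, not the authors' implementation.

```python
import numpy as np

def sliced_wasserstein(x, y, n_proj=50, p=2, rng=None):
    """Monte Carlo estimate of the order-p Sliced-Wasserstein distance
    between two empirical samples x, y of shape (n, d)."""
    rng = np.random.default_rng() if rng is None else rng
    # Random projection directions, uniform on the unit sphere S^{d-1}.
    theta = rng.normal(size=(n_proj, x.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both samples and sort: for equal-size 1D samples, the
    # p-Wasserstein distance reduces to comparing order statistics.
    x_proj = np.sort(x @ theta.T, axis=0)
    y_proj = np.sort(y @ theta.T, axis=0)
    return np.mean(np.abs(x_proj - y_proj) ** p) ** (1.0 / p)

def abc_rejection_sw(y_obs, prior_sampler, simulator, eps, n_iter=10_000, seed=0):
    """Keep parameters whose simulated data fall within eps of the
    observations in Sliced-Wasserstein distance (no summary statistics)."""
    rng = np.random.default_rng(seed)
    accepted = []
    for _ in range(n_iter):
        theta = prior_sampler(rng)
        y_sim = simulator(theta, rng)  # assumed same sample size as y_obs
        if sliced_wasserstein(y_obs, y_sim, rng=rng) <= eps:
            accepted.append(theta)
    return np.asarray(accepted)
```

In practice a sequential (SMC-style) ABC sampler with a decreasing tolerance schedule would be preferred, and unequal sample sizes would require quantile interpolation in the projected 1D distances; the rejection loop above only illustrates where the SW discrepancy plugs in.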
11 Citations
Fast Approximation of the Sliced-Wasserstein Distance Using Concentration of Random Projections
- Computer Science, NeurIPS
- 2021
This work adopts a new perspective to approximate SW by making use of the concentration of measure phenomenon, and develops a simple deterministic approximation that is both accurate and easy to use compared to the usual Monte Carlo approximation.
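For context, the "usual Monte Carlo approximation" referred to here averages closed-form one-dimensional Wasserstein distances over $L$ random projection directions (a standard estimator, not specific to this paper):

$$\widehat{\mathrm{SW}}_p^p(\mu,\nu) \;=\; \frac{1}{L}\sum_{l=1}^{L} W_p^p\big(\theta_l\sharp\mu,\;\theta_l\sharp\nu\big), \qquad \theta_l \sim \mathcal{U}(\mathbb{S}^{d-1}),$$

where $\theta\sharp\mu$ denotes the pushforward of $\mu$ by the projection $x \mapsto \langle\theta, x\rangle$; the cited work replaces this random average with a deterministic approximation based on concentration of the projections.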
Shedding a PAC-Bayesian Light on Adaptive Sliced-Wasserstein Distances
- Computer Science, ArXiv
- 2022
The PAC-Bayesian theory and the central observation that SW actually hinges on a slice-distribution-dependent Gibbs risk are leveraged to bring new contributions to this line of research.
Statistical, Robustness, and Computational Guarantees for Sliced Wasserstein Distances
- Computer Science, ArXiv
- 2022
This work quantifies the scalability of sliced Wasserstein distances from three key aspects: empirical convergence rates, robustness to data contamination, and efficient computational methods. It characterizes minimax optimal, dimension-free robust estimation risks and shows an equivalence between robust 1-Wasserstein estimation and robust mean estimation.
Minimax confidence intervals for the Sliced Wasserstein distance
- Computer Science, Electronic Journal of Statistics
- 2022
Confidence intervals for the Sliced Wasserstein distance are constructed which have finite-sample validity under no assumptions or under mild moment assumptions and are adaptive in length to the regularity of the underlying distributions.
Discrepancy-based Inference for Intractable Generative Models using Quasi-Monte Carlo
- Computer Science
- 2021
Results are sample complexity bounds which demonstrate that, under smoothness conditions on the generator, QMC can significantly reduce the number of samples required to obtain a given level of accuracy when using three of the most common discrepancies: the maximum mean discrepancy, the Wasserstein distance, and the Sinkhorn divergence.
Fast Approximation of the Generalized Sliced-Wasserstein Distance
- Computer Science, ArXiv
- 2022
This work proposes to form deterministic and fast approximations of the generalized sliced Wasserstein distance by using the concentration of random projections when the defining functions are polynomial functions, circular functions, or neural-network-type functions.
Generalized Sliced Distances for Probability Distributions
- Computer Science, ArXiv
- 2020
A broad family of probability metrics, coined Generalized Sliced Probability Metrics (GSPMs) and deeply rooted in the generalized Radon transform, is introduced, and it is shown that, under mild assumptions, the associated gradient flow converges to the global optimum.
Hierarchical Sliced Wasserstein Distance
- Computer Science, ArXiv
- 2022
The metricity of the Hierarchical Sliced Wasserstein (HSW) distance is derived by proving the injectivity of the Hierarchical Radon Transform (HRT), and the theoretical properties of HSW are investigated, including its connection to SW variants and its computational and sample complexities.
Revisiting Sliced Wasserstein on Images: From Vectorization to Convolution
- Computer Science, ArXiv
- 2022
Convolution sliced Wasserstein (CSW) is derived by incorporating stride, dilation, and non-linear activation functions into the convolution operators, and is demonstrated to perform favorably both in comparing probability measures over images and in training deep generative models on images.
On Transportation of Mini-batches: A Hierarchical Approach
- Computer Science, ICML
- 2022
This work proposes a novel mini-batch scheme for optimal transport, named Batch of Mini-batches Optimal Transport (BoMb-OT), that considers the optimal coupling between mini-batches and can be seen as an approximation of a well-defined distance on the space of probability measures.
References
Showing 1-10 of 30 references
Approximate Bayesian computation with the Wasserstein distance
- Computer Science, Journal of the Royal Statistical Society: Series B (Statistical Methodology)
- 2019
This work proposes to avoid the use of summaries and the ensuing loss of information by instead using the Wasserstein distance between the empirical distributions of the observed and synthetic data, and generalizes the well‐known approach of using order statistics within approximate Bayesian computation to arbitrary dimensions.
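The order-statistics remark refers to the one-dimensional case, where the $p$-Wasserstein distance between two equal-size empirical distributions has a closed form in terms of sorted samples (a standard identity, recalled here for context):

$$W_p^p\Big(\tfrac{1}{n}\textstyle\sum_{i}\delta_{x_i},\;\tfrac{1}{n}\textstyle\sum_{i}\delta_{y_i}\Big) \;=\; \frac{1}{n}\sum_{i=1}^{n}\big|x_{(i)} - y_{(i)}\big|^p,$$

where $x_{(1)}\le\dots\le x_{(n)}$ and $y_{(1)}\le\dots\le y_{(n)}$ are the order statistics of the two samples.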
K2-ABC: Approximate Bayesian Computation with Kernel Embeddings
- Computer Science, AISTATS
- 2016
This paper proposes a fully nonparametric ABC paradigm which circumvents the need for manually selecting summary statistics, and uses maximum mean discrepancy (MMD) as a dissimilarity measure between the distributions over observed and simulated data.
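For comparison with the distance-based discrepancies above, here is a minimal sketch of the (biased) squared-MMD estimate with a Gaussian kernel, the kind of quantity K2-ABC-style methods use as the data discrepancy; the bandwidth is an illustrative placeholder and would normally be chosen by a median heuristic.

```python
import numpy as np

def mmd2_rbf(x, y, bandwidth=1.0):
    """Biased estimate of squared MMD between samples x (n, d) and y (m, d)
    using a Gaussian (RBF) kernel."""
    def gram(a, b):
        sq = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2.0 * a @ b.T
        return np.exp(-sq / (2.0 * bandwidth**2))
    # MMD^2 = E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)], estimated by Gram-matrix means.
    return gram(x, x).mean() + gram(y, y).mean() - 2.0 * gram(x, y).mean()
```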
Approximate Bayesian Computation with Kullback-Leibler Divergence as Data Discrepancy
- Computer Science, AISTATS
- 2018
This work adopts a Kullback-Leibler divergence estimator to assess the data discrepancy and achieves comparable or higher quasi-posterior quality than existing methods using other discrepancy measures.
Asymptotic Guarantees for Learning Generative Models with the Sliced-Wasserstein Distance
- Computer Science, Mathematics, NeurIPS
- 2019
A central limit theorem is proved, which characterizes the asymptotic distribution of the estimators and establishes a convergence rate of $\sqrt{n}$, where $n$ denotes the number of observed data points.
Sliced-Wasserstein Flows: Nonparametric Generative Modeling via Optimal Transport and Diffusions
- Computer Science, ICML
- 2019
This study proposes a novel parameter-free algorithm for learning the underlying distributions of complicated datasets and sampling from them, based on a functional optimization problem that seeks a measure that is as close as possible to the data distribution while remaining expressive enough for generative modeling purposes.
Generalized Sliced Wasserstein Distances
- Computer Science, NeurIPS
- 2019
The generalized Radon transform is utilized to define a new family of distances for probability measures, which are called generalized sliced-Wasserstein (GSW) distances, and it is shown that, similar to the SW distance, the GSW distance can be extended to a maximum GSW (max- GSW) distance.
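For reference, the GSW distance replaces the linear projections of SW with a family of nonlinear defining functions $g_\theta$ (stated here under the usual assumptions that make the generalized Radon transform well defined):

$$\mathrm{GSW}_p(\mu,\nu) \;=\; \Big(\int_{\Theta} W_p^p\big(g_\theta\sharp\mu,\;g_\theta\sharp\nu\big)\,d\theta\Big)^{1/p},$$

with the max-GSW variant taking a supremum over $\theta\in\Theta$ instead of the average.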
Generative Modeling Using the Sliced Wasserstein Distance
- Computer Science, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
- 2018
This work considers an alternative formulation for generative modeling based on random projections which, in its simplest form, results in a single objective rather than a saddle-point formulation, and finds the approach to be significantly more stable than even the improved Wasserstein GAN.
Constructing summary statistics for approximate Bayesian computation: semi‐automatic approximate Bayesian computation
- Computer Science
- 2012
This work shows how to construct appropriate summary statistics for ABC in a semi‐automatic manner, and shows that optimal summary statistics are the posterior means of the parameters.
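The optimality result can be stated compactly: under quadratic loss on the parameters, the ideal summary statistic is the posterior mean,

$$s(y) \;=\; \mathbb{E}[\theta \mid y],$$

which semi-automatic ABC approximates by regressing simulated parameters on (functions of) the corresponding simulated data.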
Approximate Bayesian computation in population genetics.
- Computer Science, Genetics
- 2002
A key advantage of the method is that the nuisance parameters are automatically integrated out in the simulation step, so that the large numbers of nuisance parameters that arise in population genetics problems can be handled without difficulty.
Sharp asymptotic and finite-sample rates of convergence of empirical measures in Wasserstein distance
- Mathematics, Computer Science, Bernoulli
- 2019
This work considers the fundamental question of how quickly the empirical measure obtained from $n$ independent samples from $\mu$ approaches $\mu$ in the Wasserstein distance of any order, and proves sharp asymptotic and finite-sample results for this rate of convergence for general measures on general compact metric spaces.
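A representative instance of these rates, stated informally for an absolutely continuous measure $\mu$ on a compact subset of $\mathbb{R}^d$ with $d > 2p$:

$$\mathbb{E}\big[W_p(\mu_n,\mu)\big] \;\asymp\; n^{-1/d},$$

i.e. the empirical measure converges at a dimension-dependent rate, which is precisely the behavior that motivates slicing-based alternatives such as the SW distance.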