A Deep Generative Approach to Conditional Sampling

@article{Zhou2021ADG,
  title={A Deep Generative Approach to Conditional Sampling},
  author={Xingyu Zhou and Yuling Jiao and Jin Liu and Jian Huang},
  journal={Journal of the American Statistical Association},
  year={2021}
}
We propose a deep generative approach to sampling from a conditional distribution, based on a unified formulation of the conditional distribution and the generalized nonparametric regression function via the noise-outsourcing lemma. The proposed approach aims to learn a conditional generator so that a random sample from the target conditional distribution can be obtained by applying the conditional generator to a sample drawn from a reference distribution. The conditional generator is…
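
The abstract is truncated above, but the construction it describes can be made concrete: learn a map G so that, with η drawn from a fixed reference distribution (for example a standard Gaussian) independent of X, the pair (X, G(η, X)) matches the joint distribution of (X, Y); the noise-outsourcing lemma guarantees such a G exists. As a rough illustration only, the following sketch trains a conditional generator with a conditional-GAN-style objective on toy bimodal data. The architectures, hyperparameters, and the adversarial criterion are assumed choices standing in for whatever estimation details the truncated text goes on to describe; this is not the authors' exact estimator.

# A minimal sketch, not the paper's exact method: a conditional generator
# G(eta, x) trained so that pairs (x, G(eta, x)) resemble data pairs (x, y).
# All network sizes and the GAN-style loss below are assumptions.
import torch
import torch.nn as nn

d_x, d_y, d_eta = 1, 1, 5  # covariate, response, and reference-noise dimensions

# Conditional generator: (eta, x) -> an approximate draw from P(Y | X = x).
G = nn.Sequential(nn.Linear(d_eta + d_x, 64), nn.ReLU(), nn.Linear(64, d_y))
# Discriminator: scores pairs (x, y), comparing (X, G(eta, X)) with (X, Y).
D = nn.Sequential(nn.Linear(d_x + d_y, 64), nn.ReLU(), nn.Linear(64, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def sample_data(n):
    # Toy bimodal conditional law: Y | X = x is a two-component mixture,
    # so a single point prediction is inadequate but a sampler is not.
    x = torch.rand(n, d_x)
    sign = torch.randint(0, 2, (n, 1)).float() * 2 - 1
    return x, sign * x + 0.1 * torch.randn(n, d_y)

for step in range(2000):
    x, y = sample_data(128)
    eta = torch.randn(128, d_eta)           # draw from the reference distribution
    y_fake = G(torch.cat([eta, x], dim=1))  # conditional samples given x

    # Discriminator update: real pairs (x, y) vs. generated pairs (x, y_fake).
    d_loss = bce(D(torch.cat([x, y], dim=1)), torch.ones(128, 1)) + \
             bce(D(torch.cat([x, y_fake.detach()], dim=1)), torch.zeros(128, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update: make (x, G(eta, x)) indistinguishable from (x, y).
    g_loss = bce(D(torch.cat([x, y_fake], dim=1)), torch.ones(128, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Conditional sampling at a new covariate value is a single forward pass.
with torch.no_grad():
    x0 = torch.full((500, d_x), 0.7)
    samples = G(torch.cat([torch.randn(500, d_eta), x0], dim=1))  # ~ P(Y | X = 0.7)

Once trained, sampling from the estimated conditional distribution at any covariate value requires only fresh reference noise and a forward pass of G, which is the practical payoff of the conditional-generator formulation over direct conditional density estimation.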