Nonparametric Conditional Density Estimation in a High-Dimensional Regression Setting

@article{Izbicki2015NonparametricCD,
  title={Nonparametric Conditional Density Estimation in a High-Dimensional Regression Setting},
  author={Rafael Izbicki and Ann B. Lee},
  journal={Journal of Computational and Graphical Statistics},
  year={2015},
  volume={25},
  pages={1297--1316}
}
In some applications (e.g., in cosmology and economics), regression alone is not adequate to represent the association between a predictor x and a response Z because of multi-modality and asymmetry of f(z|x); using the full density instead of a single-point estimate can then lead to less bias in subsequent analyses. As of now, there are no effective ways of estimating f(z|x) when x represents high-dimensional, complex data. In this article, we propose a new nonparametric estimator of f(z|x) that…
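The spectral-series estimator proposed in the paper is not reproduced here; as background for the abstract's point, below is a minimal sketch of the classical double-kernel conditional density estimator on data where f(z|x) is bimodal, so the regression function E[Z|x] = 0 falls between the two modes. All function names, bandwidths, and data-generating choices are illustrative, not from the paper.

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def kernel_cde(x_train, z_train, x0, z_grid, h=0.2, g=0.2):
    """Classical double-kernel estimate of f(z | x = x0) on a grid:
    f_hat(z|x0) = sum_i K_h(x0 - x_i) K_g(z - z_i) / sum_i K_h(x0 - x_i).
    """
    wx = gaussian_kernel((x0 - x_train) / h) / h                    # weights in x
    wz = gaussian_kernel((z_grid[:, None] - z_train[None, :]) / g) / g
    return (wz * wx[None, :]).sum(axis=1) / wx.sum()

# z | x is a two-branch mixture: f(z|x) is bimodal, with modes near +/- x,
# while the conditional mean E[Z|x] = 0 sits in a low-density region.
rng = np.random.default_rng(0)
n = 2000
x = rng.uniform(0, 1, n)
branch = rng.integers(0, 2, n)
z = np.where(branch == 1, x, -x) + rng.normal(0, 0.1, n)

z_grid = np.linspace(-2, 2, 401)
f_hat = kernel_cde(x, z, x0=0.8, z_grid=z_grid)
# f_hat integrates to ~1 over the grid and shows modes near z = +/- 0.8.
```

A single-point regression estimate at x0 = 0.8 would return roughly 0, where the estimated density has a pronounced dip; the full density exposes both branches.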
Converting High-Dimensional Regression to High-Dimensional Conditional Density Estimation
There is a growing demand for nonparametric conditional density estimators (CDEs) in fields such as astronomy and economics. In astronomy, for example, one can dramatically improve estimates of the…
Adaptive greedy algorithm for moderately large dimensions in kernel conditional density estimation
This paper studies the estimation of the conditional density f(x, ·) of Y_i given X_i = x, from the observation of an i.i.d. sample (X_i, Y_i) ∈ R^d, i = 1, …, n. We assume that f depends only…
Nonparametric method for sparse conditional density estimation in moderately large dimensions
The method addresses several issues: it is greedy and computationally efficient thanks to an iterative procedure, it avoids the curse of high dimensionality under suitably defined sparsity conditions via early variable selection during the procedure, and it converges at a quasi-optimal minimax rate.
Nonlinear Regression Estimation Using Subset-Based Kernel Principal Components
We study the estimation of conditional mean regression functions through the so-called subset-based kernel principal component analysis (KPCA). Instead of using one global kernel feature space, we…
Conditional density estimation tools in python and R with applications to photometric redshifts and likelihood-free cosmological inference
The goal of this work is to provide a comprehensive range of statistical tools and open-source software for nonparametric CDE and method assessment which can accommodate different types of settings and be easily fit to the problem at hand.
Distribution-free conditional predictive bands using density estimators
Two conformal methods based on conditional density estimators that do not depend on this type of assumption to obtain asymptotic conditional coverage are introduced: Dist-split and CD-split.
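The exact Dist-split and CD-split constructions are given in that paper; purely to illustrate the general idea, here is a generic split-conformal prediction set built from a density-based conformity score (the estimator `kernel_cde`, the data-generating function `draw`, and all bandwidths and split sizes are illustrative assumptions, not the paper's method):

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def kernel_cde(x_train, z_train, x0, z_query, h=0.2, g=0.2):
    """Double-kernel estimate of f(z | x = x0) at the points z_query."""
    zq = np.atleast_1d(z_query)
    wx = gaussian_kernel((x0 - x_train) / h) / h
    wz = gaussian_kernel((zq[:, None] - z_train[None, :]) / g) / g
    return (wz * wx[None, :]).sum(axis=1) / wx.sum()

rng = np.random.default_rng(1)

def draw(n):
    x = rng.uniform(0, 1, n)
    z = np.sign(rng.uniform(-1, 1, n)) * x + rng.normal(0, 0.1, n)
    return x, z

x_tr, z_tr = draw(1000)      # split 1: fit the density estimator
x_cal, z_cal = draw(1000)    # split 2: calibration scores
x_te, z_te = draw(1000)      # held-out evaluation

alpha = 0.1
# Conformity score: estimated conditional density at the observed response.
cal_scores = np.array([kernel_cde(x_tr, z_tr, xc, zc)[0]
                       for xc, zc in zip(x_cal, z_cal)])
# Split-conformal threshold: keep z-values whose density exceeds t, where t
# is the floor(alpha * (n_cal + 1))-th smallest calibration score.
k = int(np.floor(alpha * (len(cal_scores) + 1)))
t = np.sort(cal_scores)[max(k - 1, 0)]

covered = np.array([kernel_cde(x_tr, z_tr, xt, zt)[0] >= t
                    for xt, zt in zip(x_te, z_te)])
print(f"empirical coverage: {covered.mean():.3f}  (target >= {1 - alpha})")
```

By exchangeability of the calibration and test pairs, the resulting density-level sets attain marginal coverage of at least 1 − α regardless of how good the underlying density estimate is; the cited paper's contribution is in strengthening this toward conditional coverage.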
Nonparametric Conditional Density Estimation In A Deep Learning Framework For Short-Term Forecasting
This paper incorporates machine learning algorithms into a conditional distribution estimator for forecasting tropical cyclone intensity, proposing a technique that simultaneously estimates the entire conditional distribution while flexibly allowing machine learning methods to be plugged in.
Wasserstein Generative Learning of Conditional Distribution
This work establishes a non-asymptotic error bound on the conditional sampling distribution generated by the proposed method and shows that it is able to mitigate the curse of dimensionality, assuming that the data distribution is supported on a lower-dimensional set.
Photo-z Estimation: An Example of Nonparametric Conditional Density Estimation under Selection Bias
Redshift is a key quantity for inferring cosmological model parameters. In photometric redshift estimation, cosmologists use the coarse data collected from the vast majority of galaxies to predict…
LinCDE: Conditional Density Estimation via Lindsey's Method
Conditional density estimation is a fundamental problem in statistics, with scientific and practical applications in biology, economics, finance and environmental studies, to name a few. In this…
