Double/Debiased Machine Learning for Treatment and Structural Parameters

@article{Chernozhukov2017DoubleDebiasedML,
  title={Double/Debiased Machine Learning for Treatment and Structural Parameters},
  author={Victor Chernozhukov and Denis Chetverikov and Mert Demirer and Esther Duflo and Christian Hansen and Whitney Newey and James M. Robins},
  journal={The Econometrics Journal},
  volume={21},
  number={1},
  pages={C1--C68},
  year={2018}
}
We revisit the classic semiparametric problem of inference on a low-dimensional parameter θ_0 in the presence of high-dimensional nuisance parameters η_0. We depart from the classical setting by allowing η_0 to be so high-dimensional that the traditional assumptions, such as Donsker properties, that limit the complexity of the parameter space for this object break down. To estimate η_0, we consider the use of statistical or machine learning (ML) methods, which are particularly well suited to…
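As a rough illustration of the cross-fitting recipe the paper develops, the sketch below estimates θ_0 in the partially linear model Y = Dθ_0 + g(X) + U, D = m(X) + V. The choice of random forests as nuisance learners and the helper name dml_plr are assumptions made here for concreteness, not the authors' implementation.

```python
# Minimal sketch of cross-fitted double/debiased ML for the partially linear model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

def dml_plr(y, d, X, n_folds=5, seed=0):
    """Cross-fitted partialling-out estimate of theta and its standard error."""
    y_res = np.zeros(len(y))
    d_res = np.zeros(len(d))
    for train, test in KFold(n_folds, shuffle=True, random_state=seed).split(X):
        # Nuisance regressions are fit on the auxiliary folds only (cross-fitting).
        ml_y = RandomForestRegressor(random_state=seed).fit(X[train], y[train])
        ml_d = RandomForestRegressor(random_state=seed).fit(X[train], d[train])
        y_res[test] = y[test] - ml_y.predict(X[test])   # residual of Y on X
        d_res[test] = d[test] - ml_d.predict(X[test])   # residual of D on X
    theta = np.sum(d_res * y_res) / np.sum(d_res ** 2)  # orthogonal-score solution
    psi = d_res * (y_res - d_res * theta)               # influence-function terms
    se = np.sqrt(np.mean(psi ** 2) / np.mean(d_res ** 2) ** 2 / len(y))
    return theta, se
```

Because the score D̃(Ỹ − D̃θ) is Neyman-orthogonal and the nuisance fits are cross-fitted, the slower-than-√n ML estimation error in g and m does not enter the first-order behavior of the estimate of θ_0.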
On doubly robust inference for double machine learning
Due to concerns about parametric model misspecification, there is interest in using machine learning to adjust for confounding when evaluating the causal effect of an exposure on an outcome.
A unifying approach for doubly-robust $\ell_1$ regularized estimation of causal contrasts
We consider inference about a scalar parameter under a nonparametric model based on a one-step estimator computed as a plug-in estimator plus the empirical mean of an estimator of the parameter's influence function.
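In generic notation (not taken from the cited paper), the one-step construction referred to above is

$$\hat\theta_{\text{1-step}} = \theta(\hat P) + \frac{1}{n}\sum_{i=1}^{n}\hat\varphi(O_i),$$

where θ(\hat P) is the plug-in estimator and \hat\varphi is an estimate of the parameter's (efficient) influence function evaluated at the observations O_i; the added empirical mean removes the first-order bias of the plug-in.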
Demystifying statistical learning based on efficient influence functions
Evaluation of treatment effects and more general estimands is typically achieved via parametric modelling, which is unsatisfactory since model misspecification is likely. Data-adaptive model building…
Variable Selection in Double/debiased Machine Learning for Causal Inference: An Outcome-Adaptive Approach
Access to high-dimensional data has made the use of machine learning in causal inference more common in recent years. The double/debiased machine learning (DML) estimator for the treatment effect…
Double debiased machine learning nonparametric inference with continuous treatments
We propose a nonparametric inference method for causal effects of continuous treatment variables, under unconfoundedness and in the presence of high-dimensional or nonparametric nuisance parameters.
The Bias-Variance Tradeoff of Doubly Robust Estimator with Targeted $L_1$ regularized Neural Networks Predictions
The doubly robust (DR) estimation of the ATE can be carried out in two steps: in the first step, the treatment and outcome are modeled, and in the second step the predictions are inserted into the DR estimator.
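For reference, with a binary treatment A, outcome Y, covariates X, fitted outcome model \hat m and propensity score \hat e, the standard DR (AIPW) score that such two-step procedures average is, in generic notation not specific to the cited paper,

$$\hat\psi_i = \hat m(1,X_i)-\hat m(0,X_i) + \frac{A_i\{Y_i-\hat m(1,X_i)\}}{\hat e(X_i)} - \frac{(1-A_i)\{Y_i-\hat m(0,X_i)\}}{1-\hat e(X_i)},$$

and the ATE estimate is the sample mean of \hat\psi_i; it remains consistent if either \hat m or \hat e is estimated consistently.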
High-dimensional doubly robust tests for regression parameters
After variable selection, standard inferential procedures for regression parameters may not be uniformly valid; there is no finite sample size at which a standard test is guaranteed to attain its nominal size.
Debiased Inference on Treatment Effect in a High-Dimensional Model
This article concerns the potential bias in statistical inference on treatment effects when a large number of covariates are present in a linear or partially linear model. While the…
Bias-aware model selection for machine learning of doubly robust functionals
While model selection is a well-studied topic in parametric and nonparametric regression or density estimation, model selection of possibly high-dimensional nuisance parameters in semiparametric…

References

Showing 1–10 of 116 references.
Variance estimation using refitted cross-validation in ultrahigh dimensional regression.
A two-stage refitted procedure based on a data-splitting technique, called refitted cross-validation, is proposed to attenuate the influence of irrelevant variables with high spurious correlations; results show that the resulting procedure performs as well as the oracle estimator, which knows the mean regression function in advance.
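A minimal sketch of the two-stage idea, assuming a lasso selector in the first stage (the selector, the two-fold split, and the helper name rcv_sigma2 are illustrative choices, not the paper's exact prescription):

```python
# Refitted cross-validation (RCV) estimate of the noise variance sigma^2.
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression

def rcv_sigma2(X, y, seed=0):
    """Two-fold refitted cross-validation estimate of the noise variance."""
    n = len(y)
    idx = np.random.default_rng(seed).permutation(n)
    halves = (idx[: n // 2], idx[n // 2:])
    estimates = []
    for sel, fit in (halves, halves[::-1]):
        # Stage 1: variable selection on one half (lasso is an illustrative choice).
        keep = np.flatnonzero(LassoCV(cv=5).fit(X[sel], y[sel]).coef_)
        # Stage 2: refit OLS on the *other* half using only the selected variables,
        # so spuriously selected variables cannot keep fitting the noise.
        if keep.size == 0:
            resid, df = y[fit] - y[fit].mean(), 1
        else:
            ols = LinearRegression().fit(X[fit][:, keep], y[fit])
            resid, df = y[fit] - ols.predict(X[fit][:, keep]), keep.size + 1
        estimates.append(np.sum(resid ** 2) / (len(fit) - df))
    return float(np.mean(estimates))
```

Refitting on the half of the data that was not used for selection means spuriously selected variables no longer fit the noise, which is what keeps the variance estimate from being biased downward.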
Approximate Residual Balancing: De-Biased Inference of Average Treatment Effects in High Dimensions.
There are many settings where researchers are interested in estimating average treatment effects and are willing to rely on the unconfoundedness assumption, which requires that the treatment assignment be as good as random conditional on observed covariates.
Valid Post-Selection and Post-Regularization Inference: An Elementary, General Approach
Here we present an expository, general analysis of valid post-selection or post-regularization inference about a low-dimensional target parameter, α, in the presence of a very high-dimensional nuisance parameter…
Post-Selection Inference for Generalized Linear Models With Many Controls
This article considers generalized linear models in the presence of many controls. We lay out a general methodology to estimate an effect of interest based on the construction of an instrument that…
Minimax estimation of a functional on a structured high-dimensional model
We introduce a new method of estimation of parameters in semiparametric and nonparametric models. The method is based on estimating equations that are U-statistics in the observations. The…
Sparse Models and Methods for Optimal Instruments with an Application to Eminent Domain
We develop results for the use of LASSO and Post-LASSO methods to form first-stage predictions and estimate optimal instruments in linear instrumental variables (IV) models with many instruments, p…
Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
It is shown that, for the optimal sample size scaling, with n at least of order s_0 log(p/s_0), the standard distributional limit for general Gaussian designs can be derived from the replica heuristics in statistical physics.
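For context, results of this kind concern the debiased (de-sparsified) lasso; in generic notation (an illustration, not quoted from the cited paper), the construction is

$$\hat\theta^{\mathrm{d}} = \hat\theta^{\mathrm{lasso}} + \frac{1}{n}\, M X^{\top}\!\left(y - X\hat\theta^{\mathrm{lasso}}\right),$$

where M is an approximate inverse of the sample covariance \hat\Sigma = X^{\top}X/n; each coordinate of \hat\theta^{\mathrm{d}} is then asymptotically Gaussian, which is what licenses standard tests and confidence intervals.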
High-dimensional instrumental variables regression and confidence sets
We propose an instrumental variables method for inference in high-dimensional structural equations with endogenous regressors. The number of regressors K can be much larger than the sample size. A…
Locally robust semiparametric estimation
We give a general construction of debiased/locally robust/orthogonal (LR) moment functions for GMM, where the derivative with respect to first-step nonparametric estimation is zero and, equivalently…
Post-Selection and Post-Regularization Inference in Linear Models with Many Controls and Instruments
An approach to estimating structural parameters in the presence of many instruments and controls, based on methods for estimating sparse high-dimensional models; it extends Belloni, Chernozhukov and Hansen (2014), which covers selection of controls in models where the variable of interest is exogenous conditional on observables.