Deep learning for individual heterogeneity: an automatic inference framework

@misc{Farrell2021DeepLF,
  title={Deep learning for individual heterogeneity: an automatic inference framework},
  author={M. Farrell and Tengyuan Liang and S. Misra},
  year={2021}
}
We develop methodology for estimation and inference using machine learning to enrich economic models. Our framework takes a standard economic model and recasts the parameters as fully flexible nonparametric functions, to capture the rich heterogeneity based on potentially high dimensional or complex observable characteristics. These “parameter functions” retain the interpretability, economic meaning, and discipline of classical parameters. In contrast to common implementations of machine… 
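As a minimal illustration of the idea described above (a sketch under stated assumptions, not the authors' implementation): in a toy linear model y = α(x) + β(x)·t, the scalar parameters α and β become "parameter functions" of covariates x, here parameterized by a small feedforward network. The data, network sizes, and function names are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: covariates X, a treatment-like variable t, outcome y (all simulated).
n, d = 200, 5
X = rng.normal(size=(n, d))
t = rng.normal(size=n)
# True heterogeneous parameters, used only to simulate y.
y = X[:, 0] + (1.0 + 0.5 * X[:, 1]) * t + 0.1 * rng.normal(size=n)

# A tiny two-layer network maps covariates x to the parameter functions
# (alpha(x), beta(x)) of the structural model y = alpha(x) + beta(x) * t.
H = 16
W1 = rng.normal(scale=0.1, size=(d, H))
b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, 2))
b2 = np.zeros(2)

def parameter_functions(X):
    """Return alpha(x) and beta(x) for each row of X."""
    hidden = np.maximum(X @ W1 + b1, 0.0)   # ReLU hidden layer
    out = hidden @ W2 + b2                  # two outputs per observation
    return out[:, 0], out[:, 1]             # alpha(x), beta(x)

alpha, beta = parameter_functions(X)
y_hat = alpha + beta * t                    # structural model with covariate-varying parameters
loss = np.mean((y - y_hat) ** 2)            # squared loss that training would minimize
print(alpha.shape, beta.shape, loss)
```

The point of the sketch is the structure: the economic model (a linear equation in t) is unchanged and interpretable, while the network supplies the flexible mapping from characteristics x to its parameters.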


Double debiased machine learning nonparametric inference with continuous treatments

We propose a nonparametric inference method for causal effects of continuous treatment variables, under unconfoundedness and in the presence of high-dimensional or nonparametric nuisance parameters.

Setting: Linear Regression and Prediction for CATE (Conditional Average Treatment Effect)

This study considers samples whose distribution switches depending on an assignment rule and studies the prediction of CATE with linear models whose dimension diverges to infinity, providing new insights into the use of causal inference methods, in particular doubly robust estimators, in the overparameterized setting.

Empirical Gateaux Derivatives for Causal Inference

We study a constructive algorithm that approximates Gateaux derivatives for statistical functionals by finite-differencing, with a focus on causal inference functionals. We consider the case where…

Generalized Lee Bounds

The approach of Lee (2009) is commonly used to bound the average causal effect in the presence of selection bias, under the assumption that the treatment effect on selection has the same sign for all subjects. This paper…

References

Showing 1–10 of 93 references

Deep Structural Estimation: With an Application to Option Pricing

We propose a novel structural estimation framework in which we train a surrogate of an economic model with deep neural networks. Our methodology alleviates the curse of dimensionality and speeds up…

Deep Neural Networks for Estimation and Inference

This work studies deep neural networks and their use in semiparametric inference, and establishes novel nonasymptotic high-probability bounds for deep feedforward neural nets for a general class of nonparametric regression-type loss functions.

Regularised orthogonal machine learning for nonlinear semiparametric models

This paper proposes a Lasso-type estimator for a high-dimensional sparse parameter identified by a single index conditional moment restriction (CMR). In addition to this parameter, the moment…

Quasi-oracle estimation of heterogeneous treatment effects

This paper develops a general class of two-step algorithms for heterogeneous treatment effect estimation in observational studies that enjoy a quasi-oracle property. Variants based on penalized regression, kernel ridge regression, and boosting show promising performance relative to existing baselines.
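The two-step structure described here (the R-learner of Nie and Wager) can be sketched as follows: first estimate the nuisance functions m(x) = E[Y|X] and e(x) = E[W|X], then regress the outcome residual Y − m(X) on the treatment residual W − e(X) to recover the treatment effect. The sketch below uses plain least-squares nuisance fits and a constant effect on simulated data; the data and helper names are hypothetical, and the paper's variants would substitute penalized regression, kernel ridge regression, or boosting in step 1.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated observational data: covariates X, binary treatment W, outcome Y.
n, d = 500, 3
X = rng.normal(size=(n, d))
propensity = 1.0 / (1.0 + np.exp(-X[:, 0]))      # treatment depends on X (confounding)
W = rng.binomial(1, propensity).astype(float)
tau_true = 2.0                                   # constant effect for this toy example
Y = X.sum(axis=1) + tau_true * W + rng.normal(size=n)

def ols_fit_predict(X, y):
    """Least-squares fit with intercept; returns in-sample predictions."""
    Z = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return Z @ coef

# Step 1: nuisance estimates m(x) = E[Y|X] and e(x) = E[W|X] (linear here).
m_hat = ols_fit_predict(X, Y)
e_hat = ols_fit_predict(X, W)

# Step 2: residual-on-residual regression; the slope estimates the effect.
y_res = Y - m_hat
w_res = W - e_hat
tau_hat = (w_res @ y_res) / (w_res @ w_res)
print(tau_hat)
```

The orthogonalization in step 2 is what yields the quasi-oracle behavior: errors in the two nuisance estimates enter the effect estimate only through their product, so moderately accurate first-step fits still give an accurate effect estimate.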

Estimating Parameters of Structural Models Using Neural Networks

This work shows that the Neural Net Estimator (NNE) converges to the Bayesian parameter posterior when the number of training datasets is sufficiently large, and examines the performance of NNE in two Monte Carlo studies.

Inference in Additively Separable Models With a High-Dimensional Set of Conditioning Variables

  • D. Kozbur
  • Journal of Business & Economic Statistics
  • 2020

This article studies nonparametric series estimation and inference for the effect of a single variable of interest x on an outcome y in the presence of potentially high-dimensional…

Double debiased machine learning nonparametric inference with continuous treatments

We propose a nonparametric inference method for causal effects of continuous treatment variables, under unconfoundedness and in the presence of high-dimensional or nonparametric nuisance parameters.

Uncertainty Quantification for Sparse Deep Learning

This paper provides semiparametric Bernstein–von Mises theorems for linear and quadratic functionals, which guarantee that implied Bayesian credible regions have valid frequentist coverage, and provides new theoretical justification for (Bayesian) deep learning with ReLU activation functions.

Post-Selection Inference for Generalized Linear Models With Many Controls

This article considers generalized linear models in the presence of many controls. We lay out a general methodology to estimate an effect of interest based on the construction of an instrument that…

Chapter 76 Large Sample Sieve Estimation of Semi-Nonparametric Models

...