• Corpus ID: 238743795

# Robust Generalized Method of Moments: A Finite Sample Viewpoint

```bibtex
@article{Rohatgi2021RobustGM,
  title   = {Robust Generalized Method of Moments: A Finite Sample Viewpoint},
  author  = {Dhruv Rohatgi and Vasilis Syrgkanis},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2110.03070}
}
```
• Published 6 October 2021
• Computer Science, Mathematics, Economics
• ArXiv
For many inference problems in statistics and econometrics, the unknown parameter is identified by a set of moment conditions. A generic method of solving moment conditions is the Generalized Method of Moments (GMM). However, classical GMM estimation is potentially very sensitive to outliers. Robustified GMM estimators have been developed in the past, but suffer from several drawbacks: computational intractability, poor dimension-dependence, and no quantitative recovery guarantees in the…
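To make the setup concrete, here is a minimal sketch of GMM estimation for the linear regression moment conditions $\mathbb{E}[x(y - x^\top\theta)] = 0$, using an identity weighting matrix (the simplest choice; variable names and the synthetic data are illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import minimize

def gmm_objective(theta, X, y):
    # Sample moment vector g_bar = (1/n) * sum_i x_i * (y_i - x_i' theta);
    # GMM with identity weighting minimizes ||g_bar||^2.
    residuals = y - X @ theta
    g_bar = (X * residuals[:, None]).mean(axis=0)
    return g_bar @ g_bar

# Synthetic data: well-specified linear model, no outliers.
rng = np.random.default_rng(0)
n, d = 500, 2
X = rng.normal(size=(n, d))
theta_true = np.array([1.0, -2.0])
y = X @ theta_true + 0.1 * rng.normal(size=n)

res = minimize(gmm_objective, x0=np.zeros(d), args=(X, y))
theta_hat = res.x  # close to theta_true on clean data
```

On clean data this recovers the true parameter; the paper's point is that a single grossly corrupted observation can move `g_bar`, and hence `theta_hat`, arbitrarily far.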
## Citations (1)

Efficient multivariate low-degree tests via interactive oracle proofs of proximity for polynomial codes
• Computer Science
Electron. Colloquium Comput. Complex.
• 2021
The first interactive oracle proofs of proximity (IOPP) for tensor products of Reed-Solomon codes and for Reed-Muller codes (evaluation of polynomials with bounds on individual degrees) are presented.

## References

Showing 1–10 of 18 references
Robust inference with GMM estimators
• Mathematics
• 2001
The local robustness properties of generalized method of moments (GMM) estimators and of a broad class of GMM based tests are investigated in a unified framework. GMM statistics are shown to have…
Robust Regression Revisited: Acceleration and Improved Estimation Rates
• Computer Science, Mathematics
ArXiv
• 2021
An identifiability proof introduced in the context of the sum-of-squares algorithm of [BP21], which achieved optimal error rates but required large polynomial runtime and sample complexity, is reinterpreted within the Sever framework, yielding a dramatically faster and more sample-efficient algorithm under fewer distributional assumptions.
An Automatic Finite-Sample Robustness Metric: Can Dropping a Little Data Change Conclusions?
• Mathematics
• 2020
We propose a method to assess the sensitivity of econometric analyses to the removal of a small fraction of the sample. Analyzing all possible data subsets of a certain size is computationally…
A natural robustification of the ordinary instrumental variables estimator
• Mathematics, Medicine
Biometrics
• 2013
This work constructs RIV using a robust multivariate location and scatter S-estimator to robustify the solution of the estimating equations that define OIV, and endow RIV with an iterative algorithm which allows for the estimation of models with endogenous continuous covariates and exogenous dummy covariates.
Robust Estimators in High Dimensions without the Computational Intractability
• Computer Science, Mathematics
2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)
• 2016
This work obtains the first computationally efficient algorithms for agnostically learning several fundamental classes of high-dimensional distributions: a single Gaussian, a product distribution on the hypercube, mixtures of two product distributions (under a natural balancedness condition), and k Gaussians with identical spherical covariances.
Being Robust (in High Dimensions) Can Be Practical
• Computer Science, Mathematics
ICML
• 2017
This work establishes sample complexity bounds that are optimal up to logarithmic factors, and gives various refinements that allow the algorithms to tolerate a much larger fraction of corruptions.
Efficient Algorithms and Lower Bounds for Robust Linear Regression
• Computer Science, Mathematics
SODA
• 2019
Any polynomial time SQ learning algorithm for robust linear regression (in Huber's contamination model) with estimation complexity must incur an error of $\Omega(\sqrt{\epsilon} \sigma)$.
The Influence Curve and Its Role in Robust Estimation
• 1974
This paper treats essentially the first derivative of an estimator viewed as a functional and the ways in which it can be used to study local robustness properties. A theory of robust…
Sever: A Robust Meta-Algorithm for Stochastic Optimization
• Computer Science, Mathematics
ICML
• 2019
This work introduces a new meta-algorithm that can take a base learner, such as least squares or stochastic gradient descent, and harden it to be resistant to outliers; in experiments, it has substantially greater robustness than several baselines.
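As a rough illustration of the filtering idea behind Sever (a sketch only, not the authors' exact procedure; the removal fraction, iteration count, and data are arbitrary choices for this example), one can wrap least squares in a loop that scores per-point gradients along their top singular direction and drops the highest-scoring points:

```python
import numpy as np

def sever_style_least_squares(X, y, eps=0.1, iters=5):
    # Sever-style loop: fit a base learner, compute per-point gradients at the
    # fit, score each point by its squared projection onto the top singular
    # direction of the centered gradients, drop the top-scoring fraction, refit.
    idx = np.arange(len(y))
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        theta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        grads = X[idx] * (X[idx] @ theta - y[idx])[:, None]  # per-point gradients
        centered = grads - grads.mean(axis=0)
        top_dir = np.linalg.svd(centered, full_matrices=False)[2][0]
        scores = (centered @ top_dir) ** 2
        idx = idx[scores <= np.quantile(scores, 1 - eps / 2)]
    return theta

# Linear model with 5% of responses grossly corrupted.
rng = np.random.default_rng(1)
n, d = 1000, 2
X = rng.normal(size=(n, d))
theta_true = np.array([1.0, -2.0])
y = X @ theta_true + 0.1 * rng.normal(size=n)
y[:50] += 100.0  # outliers

theta_hat = sever_style_least_squares(X, y)  # close to theta_true
```

The corrupted points have gradients of enormous norm, so they dominate the top singular direction of the gradient matrix and are filtered out within a few iterations.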
Two Stage Least Absolute Deviations Estimators
• 1982
In this paper the method of least absolute deviations is applied to the estimation of the parameters of a structural equation in the simultaneous equations model. A class of estimators called two…