This paper considers tests of the parameter on an endogenous variable in an instrumental variables regression model. The focus is on determining tests that have some optimal power properties. We start by considering a model with normally distributed errors and known error covariance matrix. We consider tests that are similar and satisfy a natural …
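For context, a minimal sketch of the classical Anderson–Rubin statistic, which is a similar test in this normal, known-covariance setting; it is a benchmark rather than the optimal test studied in the paper, and the variable names below are illustrative.

```python
import numpy as np
from scipy.stats import chi2

def anderson_rubin_test(y, Y, Z, beta0, Omega):
    """Anderson-Rubin test of H0: beta = beta0 in y = Y*beta + u with instruments Z.

    Omega is the (assumed known) 2x2 covariance of the reduced-form errors of [y, Y].
    Returns the AR statistic and its chi-squared(k) p-value, k = number of instruments.
    """
    n, k = Z.shape
    PZ = Z @ np.linalg.solve(Z.T @ Z, Z.T)   # projection onto the instrument space
    b0 = np.array([1.0, -beta0])
    W = np.column_stack([y, Y])              # reduced-form outcomes [y, Y]
    num = b0 @ (W.T @ PZ @ W) @ b0           # equals (y - Y*beta0)' P_Z (y - Y*beta0)
    stat = num / (b0 @ Omega @ b0)
    return stat, chi2.sf(stat, df=k)
```

With normal errors and known covariance, the statistic is exactly chi-squared with k degrees of freedom under the null, which is what makes the test similar.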
This paper considers testing problems where several of the standard regularity conditions fail to hold. We consider the case where (i) parameter vectors in the null hypothesis may lie on the boundary of the maintained hypothesis and (ii) there may be a nuisance parameter that appears under the alternative hypothesis, but not under the null. The paper …
This paper considers the problem of choosing the number of bootstrap repetitions B for bootstrap standard errors, confidence intervals, confidence regions, hypothesis tests, p-values, and bias correction. For each of these problems, the paper provides a three-step method for choosing B to achieve a desired level of accuracy. Accuracy is measured by the …
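The snippet below is not the paper's three-step rule; it is only a hedged illustration of the quantity such a rule controls: the simulation noise that a finite B leaves in a bootstrap standard error, which shrinks roughly like 1/sqrt(B). The statistic (the sample mean) and the values of B are arbitrary choices for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def boot_se(data, stat, B, rng):
    """Bootstrap standard error of `stat` based on B resamples."""
    n = len(data)
    reps = np.array([stat(data[rng.integers(0, n, n)]) for _ in range(B)])
    return reps.std(ddof=1)

# Repeating the bootstrap shows how much the reported standard error itself
# varies across runs for a given B; choosing B means trading this Monte Carlo
# noise off against computing cost.
data = rng.normal(size=200)
for B in (100, 400, 1600):
    draws = [boot_se(data, np.mean, B, rng) for _ in range(20)]
    print(B, np.std(draws))
```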
The topic of this paper is inference in models in which parameters are defined by moment inequalities and/or equalities. The parameters may or may not be identified. This paper introduces a new class of confidence sets and tests based on generalized moment selection (GMS). GMS procedures are shown to have correct asymptotic size in a uniform sense and are …
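As a rough illustration of the idea (not the paper's exact construction), the sketch below tests H0: E[m_j(W, theta)] >= 0 for all j at a fixed theta. A GMS-style step compares each standardized sample moment to a slowly diverging threshold and treats moments that look clearly slack as non-binding when simulating the critical value; the threshold kappa_n = (ln n)^(1/2) and the particular test statistic are common choices in this literature, not necessarily the paper's recommendations.

```python
import numpy as np

def gms_test(m, alpha=0.05, n_sim=5000, seed=0):
    """Hedged sketch of a GMS-style test of H0: E[m_j] >= 0 for all j.

    m : (n, k) array of moment evaluations m_j(W_i, theta) at a fixed theta.
    Returns the test statistic and a GMS critical value.
    """
    rng = np.random.default_rng(seed)
    n, k = m.shape
    mbar = m.mean(axis=0)
    sd = m.std(axis=0, ddof=1)
    t = np.sqrt(n) * mbar / sd                  # standardized sample moments
    stat = np.sum(np.minimum(t, 0.0) ** 2)      # penalize only violated inequalities

    # Generalized moment selection: moments that look clearly slack (t_j > kappa_n,
    # with kappa_n diverging slowly) are dropped when simulating the null distribution.
    kappa = np.sqrt(np.log(n))
    keep = t <= kappa

    corr = np.corrcoef(m, rowvar=False)
    Z = rng.multivariate_normal(np.zeros(k), corr, size=n_sim)
    sims = np.sum(np.minimum(Z[:, keep], 0.0) ** 2, axis=1)
    crit = np.quantile(sims, 1 - alpha)
    return stat, crit
```

A confidence set is then obtained, as usual in this literature, by collecting the parameter values theta at which the test does not reject.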
This paper considers a generalized method of moments (GMM) estimation problem in which one has a vector of moment conditions, some of which are correct and some incorrect. The paper introduces several procedures for consistently selecting the correct moment conditions. The procedures also can consistently determine whether there is a sufficient number of …
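In the spirit of this idea (the paper's exact criterion may differ), one can score each candidate set of moment conditions by its GMM overidentification statistic minus a BIC-type bonus for using more moments, and pick the set with the smallest score. In the hypothetical sketch below, j_stat is a user-supplied function returning the J statistic for a given subset of moments.

```python
import numpy as np
from itertools import combinations

def select_moments(j_stat, moment_idx, n_params, n_obs):
    """Hedged sketch of BIC-type moment selection.

    j_stat(subset) -> GMM overidentification (J) statistic computed with that subset.
    moment_idx     -> indices of all candidate moment conditions.
    Picks the subset minimizing J - (|c| - n_params) * ln(n_obs), searching over
    subsets large enough to identify the parameters.
    """
    best, best_crit = None, np.inf
    for size in range(n_params, len(moment_idx) + 1):
        for subset in combinations(moment_idx, size):
            crit = j_stat(subset) - (size - n_params) * np.log(n_obs)
            if crit < best_crit:
                best, best_crit = subset, crit
    return best, best_crit
```

The penalty rewards large sets of moments, while the J statistic blows up when an incorrect moment is included, which is what drives consistent selection.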
The local Whittle (or Gaussian semiparametric) estimator of long-range dependence, proposed by Künsch (1987) and analyzed by Robinson (1995a), has a relatively slow rate of convergence and a finite sample bias that can be large. In this paper, we generalize the local Whittle estimator to circumvent these problems. Instead of approximating the short-run …
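For reference, the standard local Whittle estimator being generalized minimizes the concentrated objective R(d) = log(m^-1 * sum_j lambda_j^(2d) I(lambda_j)) - 2d * m^-1 * sum_j log(lambda_j) over the first m Fourier frequencies. A minimal sketch (the bandwidth m and the optimization bounds are illustrative choices):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def local_whittle(x, m):
    """Standard local Whittle estimate of the memory parameter d (Robinson, 1995a).

    x : time series (1-d array);  m : number of low frequencies used (bandwidth).
    """
    n = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / n        # Fourier frequencies
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    I = np.abs(dft) ** 2 / (2 * np.pi * n)           # periodogram ordinates

    def R(d):
        # Concentrated (profile) local Whittle objective
        return np.log(np.mean(lam ** (2 * d) * I)) - 2 * d * np.mean(np.log(lam))

    res = minimize_scalar(R, bounds=(-0.49, 0.99), method="bounded")
    return res.x
```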
A deep convolutional neural network (CNN) enforces supervised information only at the output layer; hidden layers are trained by backpropagating the prediction error from the output layer without explicit supervision. We propose a supervised feature learning approach, Label Consistency Neural Network, which enforces direct supervision in late hidden …
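One way to realize direct supervision in late hidden layers (a hedged sketch, not the paper's exact architecture or loss) is to attach an auxiliary classifier to a hidden block and add its label loss to the usual output-layer loss:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AuxSupervisedCNN(nn.Module):
    """Illustrative CNN with an auxiliary classifier on a hidden block, so that
    block receives direct label supervision in addition to backpropagated error."""

    def __init__(self, n_classes=10):
        super().__init__()
        self.block1 = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.block2 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.aux_head = nn.Linear(32, n_classes)   # auxiliary classifier on a hidden layer
        self.out_head = nn.Linear(64, n_classes)   # usual output-layer classifier

    def forward(self, x):
        h1 = self.block1(x)                                          # hidden block
        h2 = self.block2(h1)                                         # late hidden block
        aux = self.aux_head(F.adaptive_avg_pool2d(h1, 1).flatten(1))
        out = self.out_head(F.adaptive_avg_pool2d(h2, 1).flatten(1))
        return aux, out

def loss_fn(aux_logits, out_logits, y, aux_weight=0.3):
    # Total loss = output-layer loss + weighted hidden-layer label loss
    return F.cross_entropy(out_logits, y) + aux_weight * F.cross_entropy(aux_logits, y)
```

The auxiliary weight and the choice of which hidden block to supervise are hyperparameters in this sketch, not values taken from the paper.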