An Easy-to-Implement Hierarchical Standardization for Variable Selection Under Strong Heredity Constraint

@article{Chen2020AnEH,
  title={An Easy-to-Implement Hierarchical Standardization for Variable Selection Under Strong Heredity Constraint},
  author={Kedong Chen and W. Li and Sijian Wang},
  journal={Journal of Statistical Theory and Practice},
  year={2020},
  volume={14},
  pages={1-32}
}
In many practical problems, regression models follow the strong heredity property (also known as marginality), which requires that the parent main effects be included whenever a second-order effect is present. Existing methods rely mostly on special penalty functions or algorithms to enforce strong heredity in variable selection. We propose a novel hierarchical standardization procedure to maintain strong heredity in variable selection. Our method is effortless to implement and is applicable to…
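The abstract only names the hierarchical standardization procedure, so the sketch below is not the authors' method; it is a minimal Python illustration, with hypothetical helper names, of what the strong heredity constraint itself demands: whenever a two-way interaction is selected, both parent main effects are pulled back into the model.

# Generic illustration of strong heredity (not the paper's procedure):
# a selected interaction x_i * x_j forces both parents x_i and x_j in.
def enforce_strong_heredity(selected_mains, selected_interactions):
    mains = set(selected_mains)
    for i, j in selected_interactions:
        mains.update((i, j))  # strong heredity: both parents must enter
    return mains

# Example: interaction (0, 1) is active, so main effect 1 is added.
print(sorted(enforce_strong_heredity({0, 2}, {(0, 1)})))  # [0, 1, 2]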
1 Citation

A new modified Lindley distribution with properties and applications

This paper introduces a new one-parameter distribution derived from the Lindley distribution, called the modified Lindley distribution. Its main feature is to operate a simple trade-off…

References

Showing 1-10 of 35 references

Variable Selection With the Strong Heredity Constraint and Its Oracle Property

Numerical results indicate that the LASSO-based method tends to remove irrelevant variables more effectively and to provide better prediction performance than previous work, while automatically enforcing the heredity constraint.

An Efficient Variable Selection Approach for Analyzing Designed Experiments

This work proposes an efficient variable selection strategy that specifically addresses the unique challenges of analyzing designed experiments; it can be computed very rapidly and finds sparse models that better satisfy the goals of experiments.

Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties

In this article, penalized likelihood approaches are proposed to handle variable selection problems, and it is shown that the newly proposed estimators perform as well as the oracle procedure in variable selection; namely, they perform as well as if the correct submodel were known.
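For concreteness, the signature penalty from this article is SCAD, usually specified through its derivative for $\beta > 0$: $p'_{\lambda}(\beta) = \lambda \left\{ I(\beta \le \lambda) + \frac{(a\lambda - \beta)_{+}}{(a-1)\lambda} I(\beta > \lambda) \right\}$, with $a > 2$ (the authors suggest $a = 3.7$). Unlike the lasso's constant-rate shrinkage, this penalty tapers off for large coefficients, leaving them nearly unbiased while still setting small ones to zero.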

Regression Shrinkage and Selection via the Lasso

A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant, is proposed.
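In symbols, the lasso estimate described here solves $\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n} \big( y_i - \beta_0 - \sum_{j} x_{ij}\beta_j \big)^2$ subject to $\sum_{j} |\beta_j| \le t$, where the bound $t \ge 0$ controls the amount of shrinkage and, for small enough $t$, sets some coefficients exactly to zero.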

Functionally Induced Priors for the Analysis of Experiments

This work develops the idea of using functional priors for the design and analysis of three-level and higher-level experiments by proposing appropriate correlation functions and coding schemes so that the prior distribution is simple and the results are interpretable.

The composite absolute penalties family for grouped and hierarchical variable selection

CAP is shown to improve on the predictive performance of the LASSO in a series of simulated experiments, including cases with $p\gg n$ and possibly mis-specified groupings, and iCAP is seen to be parsimonious in the experiments.
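Roughly, and hedging on notation, a CAP penalty combines (possibly overlapping) group norms, $T(\beta) = \sum_k \|\beta_{G_k}\|_{\gamma_k}$; hierarchy between a parent coefficient $\beta_1$ and a child $\beta_2$ can then be encoded with nested groups such as $G_1 = \{1, 2\}$ and $G_2 = \{2\}$, so that the child is penalized twice and tends to enter the model only together with its parent.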

Least angle regression

A publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates is described.
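As a quick usage sketch (assuming scikit-learn, whose lars_path function implements this algorithm; the synthetic data below is illustrative only):

import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
y = X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.standard_normal(100)

# Trace the full least-angle-regression path at roughly OLS cost.
alphas, active, coefs = lars_path(X, y, method="lar")
print(active)       # order in which covariates enter the model
print(coefs.shape)  # (n_features, n_steps): coefficients along the path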

Using Definitive Screening Designs to Identify Active First- and Second-Order Factor Effects

The ability of DSDs to correctly identify first- and second-order model terms is characterized as a function of the level of sparsity, the number of factors in the design, the signal-to-noise ratio, the model type (unrestricted or following strong heredity), the model-selection technique, and the number of augmented runs.

Regularities in data from factorial experiments

A meta-analysis of 113 data sets from published factorial experiments shows that a preponderance of active two-factor interaction effects are synergistic, meaning that when main effects are used to increase the system response, the interaction provides an additional increase, and that when main effects are used to decrease the response, the interactions generally counteract the main effects.

Hierarchical Variable Selection in Polynomial Regression Models

A theory of the hierarchical ordering of the predictors of an arbitrary polynomial regression model in m variables, where m is an arbitrary positive integer, is proposed, and an algorithm that generates all possible well-formulated models is presented.
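The well-formulated property can be checked mechanically; as a minimal sketch (my own hypothetical encoding, not the paper's algorithm), represent each polynomial term by its tuple of exponents and require that lowering any positive exponent by one yields another term already in the model:

def is_well_formulated(terms):
    """Check the hierarchy of a polynomial model.

    terms: set of exponent tuples over m variables, e.g. (1, 1) for x1*x2.
    Well-formulated: for every term, lowering any positive exponent by one
    yields another term in the model (its lower-order 'parent').
    """
    terms = set(terms)
    for t in terms:
        for k, e in enumerate(t):
            if e > 0:
                parent = t[:k] + (e - 1,) + t[k + 1:]
                if any(parent) and parent not in terms:
                    return False
    return True

# x1, x2, and x1*x2 form a well-formulated model; dropping x2 breaks it.
print(is_well_formulated({(1, 0), (0, 1), (1, 1)}))  # True
print(is_well_formulated({(1, 0), (1, 1)}))          # False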