A Caution Regarding Rules of Thumb for Variance Inflation Factors

  • Robert M. O’Brien
  • Published 13 March 2007
  • Mathematics
  • Quality & Quantity
The Variance Inflation Factor (VIF) and tolerance are both widely used measures of the degree of multicollinearity of the ith independent variable with the other independent variables in a regression model. Unfortunately, several rules of thumb – most commonly the rule of 10 – are attached to the VIF, and values at or above these thresholds are regarded by many practitioners as a sign of severe or serious multicollinearity (these rules appear in both scholarly articles and advanced statistical textbooks). When VIF reaches these threshold… 
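The VIF for the ith predictor follows directly from its definition, VIF_i = 1 / (1 − R_i²), where R_i² comes from regressing that predictor on all the others. A minimal NumPy sketch (function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def vif(X):
    """Variance inflation factors: VIF_i = 1 / (1 - R_i^2), where R_i^2
    is from regressing column i on the remaining columns (with intercept)."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for i in range(p):
        y = X[:, i]
        Z = np.column_stack([np.ones(n), np.delete(X, i, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        tss = (y - y.mean()) @ (y - y.mean())
        r2 = 1.0 - (resid @ resid) / tss
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.1 * rng.normal(size=200)   # nearly collinear with x1
x3 = rng.normal(size=200)              # independent of the others
print(vif(np.column_stack([x1, x2, x3])))  # first two VIFs large, third near 1
```

Under the rule of 10 criticized above, the first two columns would be flagged; O’Brien’s point is that crossing such a threshold alone need not invalidate the analysis.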

Overcoming the inconsistences of the variance inflation factor: a redefined VIF and a test to detect statistical troubling multicollinearity

Multicollinearity is relevant to many different fields where linear regression models are applied, and its existence may affect the analysis of ordinary least squares (OLS) estimators from both the

Detection of Multicollinearity Using Min-Max and Point-Coordinates Approach

Using the Min-Max approach with the principle of parallelism of coordinates, this paper presents an algorithm for the detection of multicollinearity among variables regardless of their number.

Collinearity diagnostic applied in ridge estimation through the variance inflation factor

ABSTRACT The variance inflation factor (VIF) is used to detect the presence of linear relationships between two or more independent variables (i.e. collinearity) in the multiple linear regression

The detection and correction of multicollinearity effects in a multiple regression diagnostics

Introduction. Statement of problem. Multiple regression is most effective at identifying the relationship between a dependent variable and a combination of several predictors when its underlying

Solutions to multicollinearity diagnostics

Introduction. Multiple regression is most effective at identifying the relationship between a dependent variable and a combination of several predictors when its underlying regression

The red indicator and corrected VIFs in generalized linear models

  • M. Özkale
  • Mathematics
    Commun. Stat. Simul. Comput.
  • 2021
Abstract Investigators that seek to employ regression analysis usually encounter the problem of multicollinearity with dependency on two or more explanatory variables. Multicollinearity is associated

The Vector Geometric Approach to Multicollinearity Diagnostics

The problems of multicollinearity among the independent variables in least-squares regression are by now well known and well documented. In the presence of the multicollinearity problem, the parameter

Determination of lag threshold on the measure of collinearity

To date, the implementation of lag selection procedures within the context of unit root tests is largely based on application of standard information criteria which are well-known to have lag

The Raise Regression: Justification, properties and application

Multicollinearity produces an inflation in the variance of the Ordinary Least Squares estimators due to the correlation between two or more independent variables (including the constant term). A

Collinearity: revisiting the variance inflation factor in ridge regression

Ridge regression has been widely applied to estimate under collinearity by defining a class of estimators that are dependent on the parameter k. The variance inflation factor (VIF) is applied to
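One common way to extend the VIF to ridge regression replaces R⁻¹ with (R + kI)⁻¹R(R + kI)⁻¹, where R is the correlation matrix of the standardized predictors, so that k = 0 recovers the OLS VIFs and larger k shrinks them. This sketch assumes that particular (Marquardt-style) definition, which may differ in detail from the definition this paper revisits:

```python
import numpy as np

def ridge_vif(X, k):
    """Ridge VIFs as the diagonal of (R + kI)^-1 R (R + kI)^-1, where R is the
    correlation matrix of the standardized predictors. k = 0 gives OLS VIFs."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    R = np.corrcoef(Z, rowvar=False)
    p = R.shape[0]
    A = np.linalg.inv(R + k * np.eye(p))
    return np.diag(A @ R @ A)

rng = np.random.default_rng(1)
x1 = rng.normal(size=300)
X = np.column_stack([x1,
                     x1 + 0.05 * rng.normal(size=300),  # nearly collinear pair
                     rng.normal(size=300)])
print(ridge_vif(X, 0.0))   # large VIFs for the collinear pair
print(ridge_vif(X, 0.5))   # substantially reduced as k grows
```

The monotone decrease in the VIF as k grows is what makes it a natural diagnostic for choosing the ridge parameter.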

Issues in Multiple Regression

  • R. Gordon
  • Business
    American Journal of Sociology
  • 1968
Controlling for variables implies conceptual distinctness between the control and zero-order variables. However, there are different levels of distinctness, some more subtle than others. These levels

Classical F-Tests and Confidence Regions for Ridge Regression

For testing general linear hypotheses in multiple regression models, it is shown that non-stochastically shrunken ridge estimators yield the same central F-ratios and t-statistics as does the least

Comment: Effect of Centering on Collinearity and Interpretation of the Constant

The decision to center or not to center the data in linear least squares depends solely on the substantive meaning of the data. We can give hundreds of examples of data for which centering (or
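A standard illustration of the centering debate (constructed for this note, not taken from the comment itself): when a predictor lies far from zero, x and x² are almost perfectly correlated, and centering x removes most of that nonessential collinearity:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(5, 10, size=500)        # positive, far from zero

# Raw polynomial terms: x and x^2 are nearly collinear
r_raw = np.corrcoef(x, x**2)[0, 1]

# After centering, the linear and quadratic terms are nearly uncorrelated
xc = x - x.mean()
r_cent = np.corrcoef(xc, xc**2)[0, 1]

print(r_raw, r_cent)   # raw correlation near 1; centered correlation near 0
```

Whether such centering helps interpretation, or merely redefines the constant, is exactly the point at issue in this exchange.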

Comment: Collinearity Diagnostics Depend on the Domain of Prediction, the Model, and the Data

The issue of mean-centering has generated a lot of discussion. The point of view described by Belsley in his current article and in Belsley, Kuh, and Welsch (1980), in contrast with Marquardt and

Regression Analysis: Statistical Modeling of a Response Variable

This paper presents a meta-modelling procedure called Estimation Procedures, which automates the labor-intensive, time-consuming, and expensive process of manually estimating the coefficients of a linear regression in a discrete-time model.

Applied Regression Analysis, Linear Models, and Related Methods

PART ONE: PRELIMINARIES Statistics and Social Science What Is Regression Analysis? Examining Data Transforming Data PART TWO: LINEAR MODELS AND LEAST SQUARES Linear Least-Squares Regression

A Guide to Econometrics

The fourth edition of "A Guide to Econometrics" provides an overview of the subject and an intuitive feel for its concepts and techniques without the notation and technical detail often characteristic of econometric textbooks.

Statistical Design and Analysis of Experiments, with Applications to Engineering and Science

Preface. PART I: FUNDAMENTAL STATISTICAL CONCEPTS. Statistics in Engineering and Science. Fundamentals of Statistical Inference. Inferences on Means and Standard Deviations. PART II: DESIGN AND

Regression Diagnostics: Identifying Influential Data and Sources of Collinearity

This chapter discusses Detecting Influential Observations and Outliers, a method for assessing Collinearity, and its applications in medicine and science.

An Investigation of Real Versus Perceived CSP in S&P-500 Firms

Firms are spending billions annually in the name of corporate social responsibility (CSR). Whilst markets are increasingly willing to reward good and responsible firms, they lack the instruments to