Variance Inflation Factor (VIF)

In regression analysis, the Variance Inflation Factor (VIF) measures the degree of multicollinearity. It is the ratio (quotient) of the variance of a parameter estimate in a model that contains several other terms to the variance of that estimate in a model containing only that one term. In other words, the VIF for a regression variable equals the ratio of the overall model variance to the variance of a model that includes only that single independent variable.
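Equivalently, in the standard formulation (stated here for clarity), the VIF of the jth predictor is written in terms of Rj², the coefficient of determination obtained by regressing Xj on all of the other predictors:

VIFj = 1 / (1 − Rj²)

When Xj is uncorrelated with the other predictors, Rj² = 0 and the VIF equals 1; as the correlation grows, the VIF increases without bound.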

The VIF is an index that measures how much collinearity increases the variance (the square of the standard error) of an estimated regression coefficient. The ratio is computed for each independent variable and shows how much that coefficient's variance is inflated by the model's multicollinearity. A high VIF indicates that the corresponding independent variable is highly collinear with the other variables in the model. Multicollinearity inflates the variance and increases the risk of type II errors: it leaves a variable's coefficient estimate consistent but unreliable.

A large variance inflation factor on an independent variable therefore signals a highly collinear relationship with the other variables, which should be taken into account when structuring the model and selecting independent variables. Multicollinearity arises in ordinary least squares (OLS) regression when two or more of the independent variables are linearly related. For example, in a regression model that examines the link between company size, revenue, and stock prices, market capitalization and revenue are the independent variables.

A variance inflation factor is one technique for gauging the degree of multicollinearity in a dataset. Multiple regression is used when someone wants to examine the influence of several factors on a particular outcome. In the example above, a company's market capitalization and its total revenue are highly correlated: as revenue grows, the firm's size tends to grow with it. This creates a multicollinearity problem in the OLS regression analysis. Consider the linear model below, which has k independent variables:

Y = β0 + β1 X1 + β2 X2 + … + βk Xk + ε

The standard error of the estimate of βj is the square root of the (j + 1)th diagonal element of s²(X′X)⁻¹, where s is the root mean squared error (RMSE); note that RMSE² is a consistent estimator of the true variance of the error term, σ². X is the regression design matrix: a matrix such that Xi, j+1 is the value of the jth independent variable for the ith observation, and such that Xi, 1, the predictor associated with the intercept term, equals 1 for every i. The dependent variable is the outcome that is influenced by the independent variables, which are the model's inputs.
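For context (this decomposition is standard, though not spelled out in the original), that diagonal element can be rewritten so that the VIF appears explicitly as the factor inflating the variance the coefficient would have if Xj were unrelated to the other predictors:

estimated Var(βj) = s² / ((n − 1) Var(Xj)) × 1 / (1 − Rj²)

where Var(Xj) is the sample variance of Xj and 1 / (1 − Rj²) is exactly VIFj.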

Multicollinearity exists when one or more of the independent variables, or inputs, are linearly related (correlated). Under multicollinearity the regression coefficients remain consistent, but they are no longer reliable because the standard errors are inflated. In statistical terms, a multicollinear multiple regression model makes it harder to estimate the relationship between each independent variable and the dependent variable.

The square root of the variance inflation factor indicates how much larger the standard error is than it would be if the variable were uncorrelated with the other predictor variables in the model; a VIF of 9, for example, means the standard error is three times as large. With severe multicollinearity, small changes in the data or in the structure of the model equation can produce large, unpredictable swings in the estimated coefficients of the independent variables.

Variance inflation factors therefore provide a fast estimate of how much a variable contributes to the standard error of the regression. When large VIFs are caused only by the inclusion of products or powers of other variables, such as polynomial or interaction terms, multicollinearity has no detrimental effects. Otherwise, it becomes difficult, if not impossible, to determine which variable has the greatest impact on the dependent variable. This is an issue because many econometric models are designed to examine precisely this type of statistical relationship between the independent and dependent variables.
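As an illustration (not part of the original article), the following Python sketch simulates the correlated market-capitalization and revenue predictors discussed above and computes each variable's VIF with statsmodels' variance_inflation_factor; the column names and the simulated numbers are purely hypothetical.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    # Hypothetical data: revenue largely determines market capitalization,
    # so the two predictors are strongly correlated; employees is mostly unrelated.
    rng = np.random.default_rng(0)
    revenue = rng.normal(100, 20, size=500)
    market_cap = 5 * revenue + rng.normal(0, 10, size=500)
    employees = rng.normal(1000, 150, size=500)

    X = pd.DataFrame({"revenue": revenue,
                      "market_cap": market_cap,
                      "employees": employees})
    X = sm.add_constant(X)  # add the intercept column of the design matrix

    # VIF of each predictor (the constant column is skipped).
    for i, name in enumerate(X.columns):
        if name != "const":
            print(name, variance_inflation_factor(X.values, i))
    # revenue and market_cap produce very large VIFs; employees stays close to 1.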

Because multicollinearity inflates coefficient variance and increases the risk of type II errors, detecting and correcting it is critical. Multicollinearity can be corrected in two ways, both of which are simple and widely used:

  • The first is to remove one (or more) of the highly correlated variables. Because these variables carry largely duplicate information, dropping them will not greatly reduce the coefficient of determination.
  • The second is to use principal component analysis (PCA) or partial least squares (PLS) regression instead of OLS regression. PLS regression can condense a large number of variables into a smaller set with no correlation between them, while PCA creates new, uncorrelated variables. Both reduce the information lost from the data and improve a model's predictive power (see the sketch after this list).
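A minimal sketch of the PCA route, assuming scikit-learn is available and reusing the hypothetical simulated data from the earlier example:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import StandardScaler

    # Hypothetical data with two strongly correlated predictors, as above.
    rng = np.random.default_rng(0)
    revenue = rng.normal(100, 20, size=500)
    market_cap = 5 * revenue + rng.normal(0, 10, size=500)
    employees = rng.normal(1000, 150, size=500)
    X_raw = np.column_stack([revenue, market_cap, employees])
    y = 2.0 * revenue + 0.1 * employees + rng.normal(0, 5, size=500)

    # Standardize, then replace the correlated predictors with principal
    # components, which are uncorrelated with one another by construction.
    X_scaled = StandardScaler().fit_transform(X_raw)
    pca = PCA(n_components=0.95)  # keep the components explaining 95% of the variance
    components = pca.fit_transform(X_scaled)

    # Regress the outcome on the components instead of the collinear originals.
    model = LinearRegression().fit(components, y)
    print(pca.explained_variance_ratio_, model.coef_)

The trade-off is that coefficients on components are harder to interpret than coefficients on the original variables, which is one reason dropping a redundant variable is often tried first.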
