Abstract:
Multicollinearity and autocorrelation pose major challenges to Ordinary Least Squares (OLS) estimation. The purpose of this study is to carry out a comparative analysis of regularized and robust regression methods in order to determine which method best addresses multicollinearity and autocorrelation in a linear model. The regularized and robust regression methods considered, as remedies to the OLS estimate, are Ridge Regression (RR), Lasso Regression, Two-Stage Ridge Regression (TR), Two-Stage Lasso Regression (TLasso), Quantile Regression (QR), Ridge Quantile Regression (RQR), Lasso Quantile Regression (LQR), Two-Stage Ridge Quantile Regression (TRQR) and Two-Stage Lasso Quantile Regression (TLQR). The data used in the study consist of a simulated dataset and two real datasets exhibiting multicollinearity and autocorrelation. The Mean Squared Error (MSE) served as the main performance criterion, supplemented by other statistics (regression coefficients, R-squared, adjusted R-squared, and Root Mean Squared Error), in comparing the performance of the regression methods. The results indicate that the TLQR method at the 0.5 quantile level is well suited to handling multicollinearity and autocorrelation when there are many predictor variables, while the TR method performs better with few predictor variables. Another important observation is the effect of sample size on the performance of the regression methods. To build a good model, one should choose an appropriate sample size and set of predictors, together with the corresponding regression method, when the data may involve multicollinearity and autocorrelation.
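
As a minimal sketch of the comparison framework described above (not the paper's exact procedure, and omitting the two-stage variants), the following Python code, assuming scikit-learn is available, fits ridge, lasso, and median (0.5-quantile) regressions to simulated data with correlated predictors and AR(1) errors, then compares them by MSE.

    # Sketch (assumed setup): compare ridge, lasso, and median regression by MSE
    # on data with multicollinear predictors and autocorrelated errors.
    import numpy as np
    from sklearn.linear_model import Ridge, Lasso, QuantileRegressor
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n, p = 200, 10

    # Highly correlated (multicollinear) predictors built from a common factor
    base = rng.normal(size=(n, 1))
    X = base + 0.1 * rng.normal(size=(n, p))

    # AR(1) errors to mimic autocorrelation
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = 0.7 * e[t - 1] + rng.normal()
    y = X @ np.ones(p) + e

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    models = {
        "Ridge": Ridge(alpha=1.0),
        "Lasso": Lasso(alpha=0.1),
        "Quantile (tau=0.5)": QuantileRegressor(quantile=0.5, alpha=0.1),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        mse = mean_squared_error(y_te, model.predict(X_te))
        print(f"{name}: MSE = {mse:.3f}")

The penalty parameters (alpha) and the AR(1) coefficient used here are illustrative choices only; the study's own simulation design and tuning procedure should be consulted for the actual settings.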