M Robust Weighted Ridge Estimator in Linear Regression Model

Authors

  • Taiwo Stephen Fayose, Department of Statistics, Federal Polytechnic Ado-Ekiti, Nigeria
  • Kayode Ayinde, Department of Statistics, Federal University of Technology, Akure, Nigeria
  • Olatayo Olusegun Alabi, Department of Statistics, Federal University of Technology, Akure, Nigeria

Keywords:

Linear regression model, Multicollinearity, M Estimator, Heteroscedasticity

Abstract

Correlated regressors are a major threat to the performance of the conventional ordinary least squares (OLS) estimator, and the ridge estimator provides more stable estimates in this circumstance. However, both the OLS and ridge estimators are sensitive to outlying observations. Previous studies have shown that the robust ridge estimator based on the M estimator fits models with multicollinearity and outliers in the outcome variable well. Since outliers are one source of heteroscedasticity, and non-robust weighted least squares has previously been adopted to account for it, this paper proposes and develops novel estimators that handle three problems (multicollinearity, heteroscedasticity and outliers) simultaneously, with the aim of identifying the most efficient (best) ones. The OLS and M robust estimators, in two weighted versions (the true weight (W0) and an estimated weight (W1)), were combined to develop the M Robust Ridge and M Robust Weighted Ridge estimators, respectively. Monte Carlo simulation experiments were conducted on linear regression models with three and six explanatory variables exhibiting different degrees of multicollinearity, heteroscedasticity structures of powers, magnitudes of outliers in the y direction, error variances, and five sample sizes. The mean square error (MSE) was used as the criterion to evaluate the performance of the new and existing estimators. The simulation results show that the proposed estimators produce efficient estimates, and they are hereby recommended.
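The idea described in the abstract can be illustrated with a minimal sketch: a Huber-type M estimator fitted by iteratively reweighted least squares (IRLS), with a ridge penalty for multicollinearity and optional observation weights for heteroscedasticity. The paper's exact weighting schemes (W0, W1) and biasing-parameter choices are not reproduced here; the function names, the fixed ridge parameter `k`, and the use of the MAD scale estimate are illustrative assumptions, with the Huber tuning constant 1.345 being the conventional choice.

```python
import numpy as np

def huber_weights(r, c=1.345):
    # Huber psi(r)/r weights; c = 1.345 gives ~95% efficiency under normality.
    a = np.abs(r)
    w = np.ones_like(a)
    mask = a > c
    w[mask] = c / a[mask]
    return w

def m_robust_weighted_ridge(X, y, k=0.1, w=None, tol=1e-8, max_iter=50):
    """Illustrative M robust (weighted) ridge estimator via IRLS.

    X, y : design matrix and response
    k    : ridge biasing parameter (assumed fixed here; the paper studies
           data-driven choices)
    w    : optional heteroscedasticity weights; None gives the unweighted
           ridge M estimator
    """
    n, p = X.shape
    if w is None:
        w = np.ones(n)
    # Initial weighted ridge solution.
    beta = np.linalg.solve(X.T @ (w[:, None] * X) + k * np.eye(p),
                           X.T @ (w * y))
    for _ in range(max_iter):
        r = y - X @ beta
        # Robust scale estimate via the median absolute deviation (MAD).
        s = np.median(np.abs(r - np.median(r))) / 0.6745
        # Combine robustness weights (downweight outliers) with the
        # heteroscedasticity weights.
        u = huber_weights(r / max(s, 1e-12)) * w
        beta_new = np.linalg.solve(X.T @ (u[:, None] * X) + k * np.eye(p),
                                   X.T @ (u * y))
        if np.linalg.norm(beta_new - beta) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta
```

Under this scheme a single gross outlier in y receives a small Huber weight and barely perturbs the coefficient estimates, while the ridge term stabilizes them against correlated regressors.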


Published

2023-08-29

How to Cite

M Robust Weighted Ridge Estimator in Linear Regression Model. (2023). African Scientific Reports, 2(2), 123. https://doi.org/10.46481/asr.2023.2.2.123

Section

Original Research
