Selecting the Optimal Value of Penalty Parameter K in Ridge Regression Estimators
Keywords:
Multicollinearity, Variance Inflation Factor (VIF), Shrinkage estimator, Ridge regression, Penalty parameter (k)
Abstract
Ridge regression is a popular parameter estimation technique used to address the multicollinearity problem that frequently arises in multiple linear regression. The ridge estimator controls the magnitude of the regression coefficients: it constrains the sum of the squared regression coefficients to be less than some constant C, called the penalty. Ridge regression thus shrinks the ordinary least squares (OLS) vector of regression coefficients towards the origin, introducing bias but yielding a smaller variance. The choice of the optimal value of the penalty parameter k in ridge regression estimators is therefore critical. A simulation study is conducted to uncover the optimal value of the penalty parameter k under different settings. This simulation study is novel in the field of ridge regression estimators, and it broadens the practical usefulness of ridge regression. Applications to three different real data sets are also considered to support the theoretical findings of the simulation study.
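The ideas in the abstract can be made concrete with a minimal sketch. The closed-form ridge estimator is (X'X + kI)⁻¹X'y, and one classical data-driven choice of k is the Hoerl-Kennard-Baldwin rule k = p·σ̂²/(b'b), where b is the OLS estimate and σ̂² the residual variance (Hoerl, Kennard, and Baldwin, 1975). The code below is an illustrative sketch, not the authors' simulation design; the function names are hypothetical.

```python
import numpy as np

def ridge_estimate(X, y, k):
    """Closed-form ridge estimator: (X'X + k*I)^(-1) X'y.
    k = 0 recovers the ordinary least squares solution."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def hkb_k(X, y):
    """Hoerl-Kennard-Baldwin choice of k: p * sigma^2 / (b'b),
    with b the OLS estimate and sigma^2 the residual variance."""
    n, p = X.shape
    b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ b_ols
    sigma2 = resid @ resid / (n - p)
    return p * sigma2 / (b_ols @ b_ols)

# Example: two nearly collinear predictors (high VIF).
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)

k = hkb_k(X, y)
b_ridge = ridge_estimate(X, y, k)
```

For any k > 0 the ridge solution has a strictly smaller norm than the OLS solution, which is the shrinkage-towards-the-origin property the abstract describes; how large k should be in practice is exactly the question the paper's simulation study addresses.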
Copyright (c) 2021 International Journal of Sciences: Basic and Applied Research (IJSBAR)
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.