Comparing Performances of Logistic Regression, Classification & Regression Trees and Artificial Neural Networks for Predicting Albuminuria in Type 2 Diabetes Mellitus
Keywords: Albuminuria, Artificial Neural Networks, Classification and Regression Trees, Logistic Regression

Abstract
In this study, the performances of classification methods were compared in order to predict the presence of albuminuria in type 2 diabetes mellitus patients. A retrospective analysis was performed on 266 subjects. We compared the performances of logistic regression (LR), classification and regression trees (C&RT), and two artificial neural network algorithms. Predictor variables were gender, urine creatinine, weight, blood urea, serum albumin, age, creatinine clearance, fasting plasma glucose, post-prandial plasma glucose, and HbA1c. For the validation set, the best classification accuracy (84.85%), sensitivity (68.0%), and the highest Youden index (0.63) were found in the multilayer perceptron (MLP) model, whose specificity was 95.12%; the specificities of all the models were close to one another. For the whole data set, the corresponding results were 84.21%, 53.95%, 0.50, and 96.32%, respectively. Consequently, the model with the highest capability to predict the presence of albuminuria was MLP. According to this model, blood urea and serum albumin were the most important variables for predicting albuminuria. On the basis of these considerations, we suggest that data should be better explored and processed by high-performance modeling methods, and that researchers should avoid assessing data with only one method in future studies focusing on albuminuria in type 2 diabetes mellitus patients or any other clinical condition.
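The performance measures reported above all derive from a model's confusion matrix; in particular, the Youden index is sensitivity + specificity − 1. A minimal sketch of these computations is shown below. The confusion-matrix counts are hypothetical (chosen so the resulting metrics match the MLP validation-set figures quoted in the abstract); the study's actual counts are not given here.

```python
# Sketch: accuracy, sensitivity, specificity and Youden index from
# confusion-matrix counts. The counts used below are hypothetical,
# chosen only to reproduce the MLP validation-set metrics above.

def classification_metrics(tp, fp, tn, fn):
    """Return (accuracy, sensitivity, specificity, Youden index)."""
    sensitivity = tp / (tp + fn)                 # true positive rate
    specificity = tn / (tn + fp)                 # true negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)   # overall hit rate
    youden = sensitivity + specificity - 1       # Youden's J statistic
    return accuracy, sensitivity, specificity, youden

# Hypothetical validation-set counts (25 positives, 41 negatives)
acc, sens, spec, j = classification_metrics(tp=17, fp=2, tn=39, fn=8)
print(f"accuracy={acc:.4f} sensitivity={sens:.4f} "
      f"specificity={spec:.4f} Youden={j:.4f}")
```

With these assumed counts the function returns accuracy 0.8485, sensitivity 0.68, specificity 0.9512, and Youden index 0.63, matching the MLP validation results in the abstract.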