COMPARATIVE STUDY OF BAYESIAN AND ORDINARY LEAST SQUARES APPROACHES

  • Rotimi Kayode Ogundeji Department of Mathematics, University of Lagos, Lagos, Nigeria.
  • Dr. Josephine Onyeka-Ubaka Department of Mathematics, Faculty of Science, University of Lagos, Akoka, Nigeria.
  • Emmanuel Yinusa
Keywords: Frequentist, Regression Models, Assumption Violations, Diagnostic Tools, Bayesian Methods

Abstract

Frequentist (classical) and Bayesian methods are the two major approaches to statistical data analysis; the key difference lies in how each treats a parameter. Frequentists regard a parameter as a fixed constant, whereas Bayesians treat it as a random variable. Recent research has seen increasing application of Bayesian methods to statistical problems and to problems in other fields. For linear regression modelling, frequentists most often use the Ordinary Least Squares (OLS) method, even when some of its assumptions are violated. The Bayesian approach can be used when the assumptions of the OLS linear regression model are not met. Using two different data sets, an empirical study was performed applying both the OLS and Bayesian approaches to linear regression modelling. The analysis showed that the linear regression model obtained by OLS does not satisfy all the assumptions required of a good model. The Bayesian approach was further established as an alternative for regression modelling on the basis of several criteria, namely the root mean square error (RMSE), mean absolute percentage error (MAPE) and mean absolute deviation (MAD). The results showed that linear regression modelling using the Bayesian approach outperforms the frequentist OLS approach.
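To illustrate the comparison pipeline the abstract describes, the following Python sketch (ours, not the authors' code; the synthetic data, priors and train/test split are assumed purely for illustration) fits the same linear model by OLS and by a conjugate Bayesian Gibbs sampler, then compares the two point predictions with the RMSE, MAPE and MAD criteria mentioned above.

```python
# Minimal sketch (not the authors' models or data): fit one linear model by
# OLS and by a Bayesian Normal-Inverse-Gamma Gibbs sampler, then compare
# out-of-sample RMSE, MAPE and MAD.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data standing in for the two empirical data sets used in the study.
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta_true = np.array([2.0, 1.5, -0.7, 0.3])
y = X @ beta_true + rng.normal(scale=1.0, size=n)

train, test = slice(0, 150), slice(150, n)
Xtr, ytr, Xte, yte = X[train], y[train], X[test], y[test]

# --- Frequentist OLS point estimate ---
beta_ols, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
pred_ols = Xte @ beta_ols

# --- Bayesian linear regression via Gibbs sampling ---
# Priors (assumed here, not taken from the paper):
#   beta ~ N(0, tau2 * I),  sigma^2 ~ Inverse-Gamma(a0, b0).
a0, b0, tau2 = 0.01, 0.01, 100.0
k = Xtr.shape[1]
XtX, Xty = Xtr.T @ Xtr, Xtr.T @ ytr

n_iter, burn = 5000, 1000
sigma2 = 1.0
draws = np.empty((n_iter, k))
for t in range(n_iter):
    # Draw beta | sigma^2, y  ~  N(m, V)
    V = np.linalg.inv(XtX / sigma2 + np.eye(k) / tau2)
    m = V @ (Xty / sigma2)
    beta = rng.multivariate_normal(m, V)
    # Draw sigma^2 | beta, y  ~  Inverse-Gamma (via reciprocal of a Gamma draw)
    resid = ytr - Xtr @ beta
    sigma2 = 1.0 / rng.gamma(a0 + len(ytr) / 2, 1.0 / (b0 + 0.5 * resid @ resid))
    draws[t] = beta

beta_bayes = draws[burn:].mean(axis=0)   # posterior mean as point estimate
pred_bayes = Xte @ beta_bayes

# --- Comparison criteria named in the abstract ---
def rmse(y, yhat): return np.sqrt(np.mean((y - yhat) ** 2))
def mape(y, yhat): return np.mean(np.abs((y - yhat) / y)) * 100
def mad(y, yhat):  return np.mean(np.abs(y - yhat))

for name, pred in [("OLS", pred_ols), ("Bayesian", pred_bayes)]:
    print(f"{name}: RMSE={rmse(yte, pred):.3f}  "
          f"MAPE={mape(yte, pred):.2f}%  MAD={mad(yte, pred):.3f}")
```

The conjugate Gibbs sampler is used only to keep the sketch dependency-free; in practice the Bayesian fit could equally be obtained with general-purpose MCMC software, and the paper's actual priors and data may differ.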


Published
2023-04-13
How to Cite
Ogundeji, R. K., Onyeka-Ubaka, J., & Yinusa, E. (2023). COMPARATIVE STUDY OF BAYESIAN AND ORDINARY LEAST SQUARES APPROACHES. Unilag Journal of Mathematics and Applications, 2(1), 60-73. Retrieved from http://lagjma.unilag.edu.ng/article/view/1332