Meta-analysis for Supporting Empirical Theories in Educational Sciences

Authors

Sławomir Pasikowski, University of Lodz

DOI:

https://doi.org/10.18778/2450-4491.20.05

Keywords:

evidence-based approach, meta-analysis, theory, systematic literature review

Abstract

This article explores the role of meta-analysis and systematic review in developing and refining empirical theories in educational sciences. It highlights the value of these methods in synthesizing research findings, identifying patterns, and improving the explanatory power and coherence of theories. It also underscores the skepticism they meet in academic circles, especially concerning meta-analysis. While meta-analysis is widely used in evidence-based approaches, its adoption in educational research remains limited in some research communities due to concerns about data quality, methodological heterogeneity, publication bias, and perceived epistemic incompatibility with constructivist or interpretive paradigms. The author argues that these challenges can be addressed through methodological rigor, data transparency, proper contextualization, and interdisciplinary training in statistics, epistemology, and logic. Meta-analysis is presented not only as a statistical tool, but also as a means of supporting intellectual inquiry and collaborative theory-building. The article calls for greater integration of meta-analytic methods into educational research, emphasizing their potential to enhance the quality, comparability, and transparency of scientific knowledge.
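
As a brief illustration (not drawn from the article) of what synthesizing findings means computationally, the sketch below pools hypothetical study-level effect sizes with a standard inverse-variance random-effects model (DerSimonian–Laird) and reports the heterogeneity statistics that the concerns about methodological heterogeneity refer to. All numbers are invented for demonstration only.

```python
# Minimal illustrative sketch: inverse-variance random-effects pooling
# (DerSimonian-Laird), the basic computation behind most meta-analyses.
import numpy as np

# Hypothetical study-level effect sizes (e.g., Hedges' g) and their variances.
effects = np.array([0.30, 0.45, 0.12, 0.60, 0.25])
variances = np.array([0.02, 0.05, 0.03, 0.08, 0.04])

# Fixed-effect (inverse-variance) weights and pooled estimate.
w = 1.0 / variances
fixed_pooled = np.sum(w * effects) / np.sum(w)

# Cochran's Q and the DerSimonian-Laird estimate of between-study variance tau^2.
q = np.sum(w * (effects - fixed_pooled) ** 2)
df = len(effects) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights, pooled effect, standard error, and 95% CI.
w_re = 1.0 / (variances + tau2)
pooled = np.sum(w_re * effects) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)

# I^2: share of total variability attributable to between-study heterogeneity.
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"Pooled effect = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f}), "
      f"tau^2 = {tau2:.3f}, I^2 = {i2:.1f}%")
```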

Author Biography

Sławomir Pasikowski, University of Lodz

Sławomir Pasikowski – Ph.D. in Education Sciences, specializing in the methodology of scientific research in education. Author of publications on the methodology of educational research and the use of statistics in social research. Vice-chair of the Section for Methodology of Research on Education at the Committee on Pedagogical Sciences of the Polish Academy of Sciences (PAN). Member of the Laboratory of Research Tools at the Committee on Pedagogical Sciences of PAN.

Published

2025-07-03

How to Cite

Pasikowski, S. (2025). Meta-analysis for Supporting Empirical Theories in Educational Sciences. Nauki O Wychowaniu. Studia Interdyscyplinarne, 20(1), 45–55. https://doi.org/10.18778/2450-4491.20.05