Evidence-based school leadership and management: A practical guide

The school research lead: meta-analysis, effect sizes and the leaking ship

Over the last few weeks I have given considerable thought to how useful effect sizes and meta-analyses are to school leaders as they try to bring about improvement in their schools. On the one hand, there is the view that effect sizes and meta-analysis have major limitations but remain the best we have, Coe (2018). On the other hand, there is the view that effect sizes and meta-analysis are fundamentally wrong, do not represent the best that we have, and that there are viable alternatives, Simpson (2017) and Simpson (2018).
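
For readers who want to see what is actually being averaged in this debate, the sketch below is a minimal illustration, using the standard textbook formulas for Cohen's d and a simple fixed-effect inverse-variance average. It is not code from any of the papers cited here, and the study figures in it are invented. The pooled number it prints is exactly the kind of summary whose comparability across differently designed studies is in dispute.

```python
# Minimal, illustrative sketch only: standard formulas for a standardised
# effect size (Cohen's d) and a simple fixed-effect meta-analytic average.
# All study figures are invented for illustration.
from math import sqrt


def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardised mean difference between treatment and control groups."""
    pooled_sd = sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd


def d_variance(d, n_t, n_c):
    """Approximate large-sample variance of Cohen's d."""
    return (n_t + n_c) / (n_t * n_c) + d ** 2 / (2 * (n_t + n_c))


def fixed_effect_pool(effects):
    """Inverse-variance weighted average of (effect size, variance) pairs."""
    return sum(d / v for d, v in effects) / sum(1 / v for _, v in effects)


# Two hypothetical studies: (mean_t, sd_t, n_t, mean_c, sd_c, n_c).
raw_studies = [
    (54.0, 10.0, 60, 50.0, 10.0, 60),  # invented study A
    (72.0, 15.0, 30, 65.0, 14.0, 30),  # invented study B
]

effects = []
for mean_t, sd_t, n_t, mean_c, sd_c, n_c in raw_studies:
    d = cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c)
    effects.append((d, d_variance(d, n_t, n_c)))
    print(f"d = {d:.2f}")

print(f"Pooled effect size: {fixed_effect_pool(effects):.2f}")
```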

So what is the evidence-based school leader or teacher to do when the scientific literature they have at their disposal is possibly either limited or just plain wrong? Well, a useful starting point is Otto Neurath's simile comparing scientists with sailors on a rotting ship: 'We are like sailors who on the open sea must reconstruct their ship but are never able to start afresh from the bottom. Where a beam is taken away a new one must at once be put there, and for this the rest of the ship is used as support. In this way, by using the old beams and driftwood the ship can be shaped entirely anew, but only by gradual reconstruction.'

This would suggest that if you think effect sizes and meta-analysis may be leaking planks on the good ship 'evidence-based education', but remain the best we have, then it would be foolish to rip them out before we have something to put in their place. Alternatively, if you are of the view that effect sizes and meta-analysis are 'wrong' and that you cannot draw useful conclusions from them, then keeping these 'leaky planks' in place may well lead to the 'ship' taking on water and becoming unstable, going off in the wrong direction or even sinking. If that is the case, then whoever is steering the ship needs to do at least three things. First, make appropriate in-course corrections to adjust for the direction in which the ship may 'naturally' drift. Second, redouble their efforts to find 'new planks' which can be used as replacements for the leaky ones. Third, find other materials which can help plug the leaks while they look for new planks.

However, from the point of view of the evidence-based school leader, it probably does not matter which stance you take on effect sizes and meta-analysis: the actions you need to undertake to address the issues at hand will in large part remain the same. First, as Kvernbekk (2016) states, when looking at research studies it is the causal claim and its associated support factors that you are after, not the average effect size. Properly conducted randomised controlled trials, which include an appropriate impact evaluation or qualitative element, may give you several clues as to what you need to do to make an intervention work in your setting.

Second, spend time making sure you are solving the right problem. Solving the right problem will have a major impact on whether or not you are successful in bringing about favourable outcomes for pupils. There is no point using high-quality and trustworthy research studies if they are being used to help you solve the wrong problem. On the other hand, if they are helping you gain a better understanding of the issues at hand, then that is another matter.

Third, as an evidence-based school leader you won't just rely on the academic and scientific literature; you will also draw upon a range of different sources of evidence, such as practitioner expertise, stakeholder views and school/organisational data, to help you come up with a solution which leads to a favourable outcome. That does not mean these other sources of evidence are without their own problems. Nevertheless, when making a decision it may be better to use these sources of evidence, while being aware of their limitations, than not to use them at all, Barends and Rousseau (2018).

Fourth, given the 'softness' of the evidence available, even if you come up with a plausible solution you will need to give a great deal of thought to the scale of implementation. In all likelihood, small, fast-moving, iterative pilot studies within your school are more likely to lead to long-term success than school-wide or multi-academy trust-wide rollouts. Langley, Moen, et al. (2009) and Bryk, Gomez, et al. (2015) provide useful guidance as to what to do given different levels of knowledge, resources and stakeholder commitment.

Fifth, as Pawson (2013) states, it is important to attend extremely closely to the 'quality of the reasoning in research reports rather than look only to the quality of the data' (p. 11). Moreover, it is necessary to give thought and effort to improving the quality of your own practical reasoning. This could be done by making sure that, before you make 'evidence-based decisions', your thinking is tested by individuals who may well disagree with you. You may also want to look at the work of Jenicek and Hitchcock (2005), who provide guidance on the nature of critical thinking and on strategies you can adopt to improve your own thinking skills.

And finally

This discussion should not be seen as an attempt to dismiss the usefulness of evidence-based practice. Rather, it should be seen as an attempt to outline what to do when the research evidence is a bit 'squishy'. Even if it were not 'squishy', effect sizes and systematic reviews would only provide you with a small fraction of the evidence you need when making decisions about which educational interventions to adopt or withdraw within your school, Kvernbekk (2016).

References

Barends, E. and Rousseau, D. (2018). Evidence-Based Management: How to Use Evidence to Make Better Organizational Decisions. London. Kogan Page.

Bryk, A. S., Gomez, L. M., Grunow, A. and LeMahieu, P. G. (2015). Learning to Improve: How America's Schools Can Get Better at Getting Better. Cambridge, MA. Harvard Education Press.

Coe, R. (2018). What Should We Do About Meta-Analysis and Effect Size? CEM Blog. http://www.cem.org/blog/what-should-we-do-about-meta-analysis-and-effect-size/. 5 December, 2018.

Jenicek, M. and Hitchcock, D. (2005). Evidence-Based Practice: Logic and Critical Thinking in Medicine. United States of America. American Medical Association Press.

Jones, G. (2018). The Ongoing Debate About the Usefulness of Effect Sizes. GaryRJones.com. https://www.garyrjones.com/blog/.

Kvernbekk, T. (2016). Evidence-Based Practice in Education: Functions of Evidence and Causal Presuppositions. London. Routledge.

Langley, G. J., Moen, R., Nolan, K. M., Nolan, T. W., Norman, C. L. and Provost, L. P. (2009). The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. San Francisco. John Wiley & Sons.

Pawson, R. (2013). The Science of Evaluation. London. Sage Publications.

Popper, K. (1992). The Logic of Scientific Discovery (5th Edition). London. Routledge.

Simpson, A. (2017). The Misdirection of Public Policy: Comparing and Combining Standardised Effect Sizes. Journal of Education Policy. 32. 4. 450-466.

Simpson, A. (2018). Princesses Are Bigger Than Elephants: Effect Size as a Category Error in Evidence‐Based Education. British Educational Research Journal. 44. 5. 897-913.