Evidence-based school leadership and management: A practical guide

Where's the evidence for evidence-based practice improving pupil outcomes?

A few weeks ago, in an online discussion with Dr David James – Deputy Head Academic at Bryanston School – David posed the following question: where is the evidence that evidence-based practice has a measurable impact on learning and outcomes? In other words, which schools can point to their exam results and say they have improved because of evidence-informed practice? And where is the backing for the claim that schools and teachers should use evidence – particularly research evidence – to inform practice? Otherwise, all we have is the assertion that the use of evidence is a good thing.

Unfortunately, at the moment, there is relatively little, if any, evidence that the use of research evidence by teachers will improve pupil outcomes (Rose, Thomas, et al., 2017). However, this may change with the forthcoming EEF evaluation of the RISE project, which was run out of Huntington School and which is due to be published in early 2019. Indeed, where evidence is available about the outcomes of teachers' use of research evidence, it relates to the positive impact it has on teachers themselves (Cordingley, 2015; Supovitz, 2015). So it is within this context that I read with interest a recently published systematic review – Simons, Zurynski, et al. (2018) – on whether evidence-based medicine training improves doctors' knowledge, practice and patient outcomes, which concludes: 'EBM training can improve short-term knowledge and skills among medical practitioners, although the evidence supporting this claim comes from relatively poor-quality evidence with a significant risk of bias. There is even less evidence supporting the claim that EBM training results in changes in clinicians' attitudes and behavior, and no evidence to suggest that EBM training results in improved patient outcomes or experiences. There is a need to improve the quality of studies investigating the effectiveness of EBM training programs.' (p5)

Now, if you are an advocate of evidence-based education, this may appear quite depressing. If medicine, where evidence-based practice first originated, has not been able to provide evidence-based medicine training to doctors which improves patient outcomes, then what chance do we have in education of training teachers and leaders to use evidence-based practice to improve pupil outcomes? Well, my own view is that we may be able to learn lessons from evidence-based medicine, which will then help us create the conditions for success within education. That does not mean this is a given: we need to learn the right lessons, adapt them in the right way for education, and then implement them in a way which allows significant adaptation to the local context. So to start this process, I am going to take the practice points identified by Simons, et al. (2018) and comment on their potential applicability within education and the implications they may have for the different 'players' within the education eco-system.

The Practice Points from the systematic review

The EBM practice landscape is changing with more emphasis on patient participation, including shared decision-making.

Most doctors benefit from EBM training that is integrated into their clinical practice, and where institutional support is evident.

Whilst EBM courses for doctors demonstrate short-term improvements in knowledge, there is no strong evidence linking EBM training to changes in clinical practice or patient outcomes.

It is important to investigate whether EBM training leads to improvements in doctors’ practice behaviors that may also facilitate changes in patient outcomes and experiences.

It may be possible to use reliable measures of clinical practice and patient experiences to evaluate EBM training, such as structured practice portfolios, patient experience surveys and multi-source feedback. (p1)

Implications and discussion

First, given the challenges that medicine appears to be having in getting training for evidence-based medicine to work with doctors, maybe we should not be too surprised if our first efforts to provide training for teachers in evidence-based practice do not lead to improvements in pupil outcomes. This in turn may require us, in the short term, to reduce our expectations about what training teachers and leaders to use evidence-based practice can achieve.

Second, given the changes in the evidence-based medicine landscape and the increased focus on patient participation and informed decision-making, all those involved in evidence-based practice within schools may need to give consideration to the role of pupils, teachers, parents and other stakeholders in evidence-based decision-making.

Third, training designed to support the use of evidence-based practice within schools will need to be sustained. It's highly unlikely that training provided in ITT or professional learning alone is going to 'deliver' evidence-based practice within schools. Rather, it is going to require an ongoing and sustained effort, and cannot be just a short-term fad or this year's priority. This is particularly important when considering both the impact of the EEF/IEE Research Schools programme and its future development, as it may be that the underpinning model needs radical re-modelling.

Fourth, if you are a school leader and want to encourage evidence-based practice within your school, then you need to make sure sufficient support is in place to build the capacity, motivation and opportunities necessary for evidence-based practice (Langer, Tripney, et al., 2016).

Fifth, given that EEF evaluations of interventions include both process and impact evaluations, it may be that medicine has much to learn from education about evaluations using multiple sources of evidence. On the other hand, Connolly, Keenan, et al. (2018) report that over 60% of randomised controlled trials within education tended to ignore both the context within which the intervention took place and the experience of participants.

And finally

It’s important to remember, when trying to evaluate the impact of any intervention on examination results, that around 97% of the variation in performance between year groups can be explained by changes in the cohort and how well individuals do ‘on the day’ (Crawford and Benton, 2017). So the impact on examination results of teachers being trained in evidence-based practice is likely to be relatively small. Indeed, it’s not enough to look at one year’s examination results; results will need to be reviewed and evaluated over a number of years.
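To get a feel for what that level of year-to-year volatility implies, here is a minimal simulation sketch in Python. The numbers are illustrative assumptions of my own – a school mean score that bounces around by roughly 2 grade points from cohort changes and ‘on the day’ effects, and a hypothetical true gain of 0.5 points from training in evidence-based practice – not figures taken from Crawford and Benton (2017).

import random
import statistics

random.seed(42)

NOISE_SD = 2.0      # assumed year-to-year cohort + 'on the day' variation
TRUE_EFFECT = 0.5   # assumed (hypothetical) true gain from the training
BASELINE = 50.0     # assumed underlying school mean score

def observed_mean(effect):
    # One year's observed school mean: baseline + effect + cohort noise.
    return BASELINE + effect + random.gauss(0, NOISE_SD)

TRIALS = 10_000

# How often does a single post-training year look worse than a single
# pre-training year, despite the genuine +0.5 point gain?
worse_one_year = sum(
    observed_mean(TRUE_EFFECT) < observed_mean(0) for _ in range(TRIALS)
)
print(f"One-year comparison points the wrong way: {worse_one_year / TRIALS:.0%}")

def five_year_mean(effect):
    # Averaging five cohorts shrinks the noise by a factor of sqrt(5).
    return statistics.mean(observed_mean(effect) for _ in range(5))

worse_five_year = sum(
    five_year_mean(TRUE_EFFECT) < five_year_mean(0) for _ in range(TRIALS)
)
print(f"Five-year comparison points the wrong way: {worse_five_year / TRIALS:.0%}")

On these assumptions, a single before-and-after comparison points in the wrong direction in a large minority of simulated runs, and even comparing five-year averages only partly tames the noise – which underlines why results need to be reviewed over a number of years rather than read off a single set of examination results.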

References

Connolly, P., Keenan, C. and Urbanska, K. (2018). The Trials of Evidence-Based Practice in Education: A Systematic Review of Randomised Controlled Trials in Education Research 1980–2016. Educational Research.

Cordingley, P. (2015). The Contribution of Research to Teachers’ Professional Learning and Development. Oxford Review of Education, 41(2), 234-252.

Crawford, C. and Benton, T. (2017). Volatility Happens: Understanding Variation in Schools’ GCSE Results. Cambridge Assessment Research Report. Cambridge, UK.

Langer, L., Tripney, J. and Gough, D. (2016). The Science of Using Science: Researching the Use of Research Evidence in Decision-Making. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.

Rose, J., Thomas, S., Zhang, L., Edwards, A., Augero, A. and Roney, P. (2017). Research Learning Communities: Evaluation Report and Executive Summary, December 2017. London: Education Endowment Foundation.

Simons, M. R., Zurynski, Y., Cullis, J., Morgan, M. K. and Davidson, A. S. (2018). Does Evidence-Based Medicine Training Improve Doctors’ Knowledge, Practice and Patient Outcomes? A Systematic Review of the Evidence. Medical Teacher, 1-7.

Supovitz, J. (2015). Teacher Data Use for Improving Teaching and Learning. In Brown, C. (Ed.) Leading the Use of Research & Evidence in Schools. London: Bloomsbury Press.