Evidence-based practice and instructional coaching - why the research evidence is not enough.

Just this week, @DrSamSims wrote a very well-argued blogpost giving four reasons why instructional coaching is the most well-evidenced form of CPD, and which concludes that 'all schools that aspire to be evidence-based should be giving it (instructional coaching) a go'.  Unfortunately, Sam’s last sentence is a tad over-enthusiastic and, for three reasons, mars what would otherwise be an excellent post.  First, it’s not for researchers to tell school leaders and teachers what they should or should not be doing in their schools and classrooms.  What school leaders and teachers prioritise in their schools and classrooms is down to their professional judgment.  Indeed, the last sentence of Sam’s blogpost is grist to the mill for the opponents of evidence-based education.  Second, the role of research evidence in evidence-based practice is to provide the backing for warrants.  As such, research evidence plays an indirect role in developing an argument supporting the use of an intervention (Kvernbekk, 2016).  Third, even if you think that instructional coaching meets an obvious priority for your school, i.e. supporting the improvement of teaching and learning, that does not mean your school should automatically do it.  Ideally, school leaders would use a disciplined process to work out whether what worked ‘there’ is going to work ‘here’.

So in the rest of this post I am, once again, going to lean on the work of Kvernbekk (2016) to examine the process you might wish to undertake before adopting even the ‘most evidenced’ intervention.

Read More

The school research lead and piling up statistical significance

Last weekend saw the annual education evidence fest, aka ResearchED 2018, take place in St John’s Wood, London.  Unfortunately, one of the inevitable disappointments of attending #rED18 is that you are unable to see all the speakers you would like to see.  As such, you often have to make do with other people’s summaries of the sessions.  So I was particularly pleased to see Schools Week come up with the article ‘ResearchED 2018: Five interesting things we learned’, and was even more pleased when I saw it contained a short summary of Dr Sam Sims’s presentation on the positive impact of instructional coaching.  However, my excitement was short-lived when I came to read the article, and in particular the reference to a statistically significant positive effect of instructional coaching being found in 10 out of 15 studies, which was then used to infer that it is “probably the best-evidenced form of CPD currently known to mankind”.

Read More

The school research lead and understanding evidence-informed practice

The start of this week will have seen most schools hold at least one day of INSET/CPD – call it what you will – to start off the new academic year.

No doubt many colleagues will have played ‘bullsh.t bingo’, ticking off the number of times terms such as ‘research’, ‘evidence’, ‘evidence-informed practice’, ‘best practice’ and ‘the evidence says’ are used by members of the senior leadership team.

Read More

The school research lead and causal cakes

At the start of term there is normally an unusual number of birthday cakes in school staffrooms, as colleagues who have had birthdays over the summer break bring cakes into school for a belated birthday celebration.  However, if you are a school research lead or champion, what you really need to share with colleagues is something known as a ‘causal cake’.  The concept of a ‘causal cake’ is particularly useful as it will give you a better idea of the knowledge needed to make reliable predictions as to whether interventions that worked ‘somewhere’ will work ‘here’ in your school.  So to help you do this, I’m going to draw upon Cartwright and Hardie (2012) and their work on causal cakes and support factors.
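
To make the idea a little more concrete, here is a minimal sketch – entirely my own illustration rather than anything taken from Cartwright and Hardie (2012) – which treats a causal cake as an intervention together with the ‘slices’, or support factors, that all need to be in place before the intervention can produce its effect. The intervention and the support factors listed are hypothetical examples only.

```python
# A toy 'causal cake': the intervention is only one slice of the cake; the effect
# is predicted to follow only if every supporting slice is also present locally.
# The support factors below are hypothetical examples, not a definitive list.

CAUSAL_CAKE = {
    "intervention": "instructional coaching",
    "support_factors": [
        "trained coaches with time for observation and feedback",
        "teachers willing to be observed and to act on feedback",
        "protected timetable slots for coaching conversations",
        "senior leadership backing sustained beyond one term",
    ],
}


def missing_slices(cake, factors_present_here):
    """Return the support factors not present 'here'; an empty list suggests,
    but does not guarantee, that what worked 'there' may also work 'here'."""
    return [f for f in cake["support_factors"] if f not in factors_present_here]


if __name__ == "__main__":
    # What we know about our own school: only two of the slices are in place.
    here = {
        "trained coaches with time for observation and feedback",
        "teachers willing to be observed and to act on feedback",
    }
    gaps = missing_slices(CAUSAL_CAKE, here)
    if gaps:
        print(f"'{CAUSAL_CAKE['intervention']}' worked 'there', but these slices are missing 'here':")
        for factor in gaps:
            print(" -", factor)
    else:
        print("All the slices of the cake appear to be in place 'here'.")
```

The point of the sketch is simply that the research evidence tells you the intervention can be one slice of a cake that produces the effect; it is your local knowledge of the remaining slices that does the work in predicting whether it will work ‘here’.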

Read More

The school research lead, the three-legged stool and how to avoid falling on your backside

As a school research lead you will no doubt have spent some of your summer reading research articles – be they systematic reviews or randomised controlled trials (RCTs) – and thinking about whether the interventions you’ve read about will work in your school and setting.  Indeed, you may have been working on a PowerPoint presentation making the case for why some well-researched and well-evidenced teaching intervention, which has shown positive outcomes in other schools, should be adopted within your school.  So to help you with the task of developing an argument – one which begins with ‘it worked there’ and concludes ‘it’ll work here’ – I am going to lean on the work of Cartwright and Hardie (2012).  However, before I do that, it’s necessary first to define a number of terms, an understanding of which is central to getting the most out of this blogpost, and then to clarify the causal claim being made when we say something ‘works’.

Read More