The school research lead and PDSA cycles – what's the evidence?

One of the challenges in commenting on evidence-based practice, particularly if you make suggestions as to how to tackle a particular issue, is making sure you have ‘research evidence’ sitting behind whatever advice you may be giving.  In my recently published book – Evidence-Based School Leadership: A practical guide – I suggest that when acting on evidence it makes sense to use a succession of Plan-Do-Study-Act (PDSA) cycles.  As such, I was delighted when I came across some research by Tichnor-Wagner et al. (2017) which examines how educators in the US responded to the use of PDSA cycles.  The rest of this post will:

·      Briefly describe the characteristics of a PDSA cycle.

·      Review Tichnor-Wagner et al’s research on PDSA cycles.

·      Consider the implications for evidence-based practitioners – of whatever level – within schools.

The PDSA Cycle

PDSA cycles have their origins in quality assurance and improvement science (Deming, 1993; Langley et al., 2009; Bryk et al., 2015).  Put simply, a PDSA cycle is a tool for planning, implementing, refining and improving an intervention or change, and is designed to help you answer three questions:

  • What are we trying to accomplish?

  • How will we know whether the change is an improvement?

  • What changes can we make that will result in improvement? (Langley et al., 2009)

There are four steps which are designed to be carried out repeatedly to help answer new questions as the intervention unfolds and develops:

·      Plan – plan a small intervention, or small test of change, to learn from, making predictions about the outcome of the intervention;

·      Do – implement the change as planned; collect data and document problems and unexpected observations. You also begin analysing the data in this stage.

·      Study – complete the analysis of the data and compare it with the predictions and expected outcomes. What are the lessons? Were there any unintended consequences, surprises, successes or failures?

·      Act – reflect and act upon what was learnt in the first three phases.


This process is repeated as many times as required.
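The four steps above can be sketched as a small loop. The following is a purely illustrative sketch in Python – the names (`Cycle`, `run_pdsa`) and the numeric "gap" check standing in for the Study phase are my own assumptions, not anything taken from the research:

```python
# A minimal, hypothetical sketch of repeated PDSA cycles.
# All names here are illustrative, not from Tichnor-Wagner et al.

from dataclasses import dataclass

@dataclass
class Cycle:
    """Record of one Plan-Do-Study-Act iteration."""
    prediction: float      # Plan: predicted outcome of the small test
    observed: float        # Do: outcome actually observed
    lessons: str = ""      # Study: comparison of prediction vs outcome
    next_step: str = ""    # Act: adopt the change, or adapt and go again

def run_pdsa(predict, intervene, max_cycles=3, tolerance=0.1):
    """Repeat PDSA cycles until prediction and observation converge."""
    history = []
    for i in range(max_cycles):
        prediction = predict(i)              # Plan
        observed = intervene(i)              # Do
        gap = abs(prediction - observed)     # Study: compare with prediction
        cycle = Cycle(prediction, observed,
                      lessons=f"gap={gap:.2f}",
                      next_step="adopt" if gap <= tolerance else "adapt")
        history.append(cycle)
        if cycle.next_step == "adopt":       # Act: stop refining
            break
    return history

# Toy usage: predictions improve as we learn from each cycle.
history = run_pdsa(predict=lambda i: 0.5 + 0.2 * i,
                   intervene=lambda i: 0.9)
print([c.next_step for c in history])  # → ['adapt', 'adapt', 'adopt']
```

The point of the sketch is simply that each pass through the loop records a prediction, an observation, and a decision, so later cycles build on what earlier ones revealed.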

Tichnor-Wagner et al.’s research on PDSA cycles

Tichnor-Wagner et al. drew upon a multi-year research project involving a comparative case study of innovation design teams in two large, urban school districts engaged in improvement work.  In the study, innovation design teams were introduced to PDSA cycles to help them: first, further develop, refine, implement, and scale the designed components of an innovation; and second, build the capacity of schools and the district to engage in continuous improvement for innovations they might implement in the future.

Data was collected through semi-structured interviews with members of the innovation design teams who participated in PDSA training and implementation, surveys of participants after training, and documents and field notes from observation of those trainings.  Data analysis involved a first round of descriptive coding that captured examples of the local context and learning opportunities.  From this first round of descriptive coding a subset of codes emerged, which were then used for a second round of coding.  Processes were put in place to ensure the inter-rater reliability of the coding undertaken by the researchers (90% reliability).  Subsequently, in-depth memos were produced for each of the school districts.  Additional analysis then examined themes related to the will and capacity of innovation design team members to implement PDSA.


In both districts, participants’ perceptions of PDSA revealed that school and district practitioners saw value in PDSA. 

The PDSA cycles built on work that participants were already doing, suggesting that PDSA may be an incremental rather than radical change to current school practices.

However, although participants thought PDSAs were similar to what they already do, they also felt the activity was disconnected from their daily work.

Although practitioners valued both the PDSA cycles and the innovations they were testing through them, they resisted the specific forms (provided by the researchers) they had to fill out for each phase, and the scheduling of when the cycles would take place, both of which caused frustration with PDSA.

There were problems in finding time and expertise for the PDSA work, which indicates that additional resources may need to be made available to support innovation and development.

Implications for evidence-based practitioners

Drawing on Tichnor-Wagner et al.’s discussion of the findings, the following points may be useful to consider when using PDSA cycles.

There is a lot to be said for implementing interventions that practitioners see value in, as this will contribute to their motivation to engage with and use the intervention.  With that in mind, I’d recommend having a look at the work of Michie et al. (2011) and the role of motivation in their Behaviour Change Wheel.

Even though the PDSA cycle, or a version of it, may be familiar to colleagues within your school, don’t assume there is the capacity or capability to make best use of the innovation.  This suggests that you need to gain a sense of colleagues’ current levels of expertise prior to implementation.

If you are trying to get colleagues to use PDSA cycles, try to make sure they replace some other activity and become part of the day-to-day work of the school, rather than something that is ‘bolted on’.  This also applies to anyone providing training on the use of PDSAs.

If you decide to use disciplined inquiry incorporating PDSA cycles to replace some element of your existing performance management (PM) processes, the problems with PM won’t disappear; in all likelihood, they’ll just change.

You might want to give consideration as to whether PDSA cycles are applicable to all forms of intervention within your school.  For example, how useful is the PDSA cycle if you are looking to develop higher levels of trust within your school?

And finally 

It’s always worth keeping your eye out for research that makes you think about things you take for granted or think are relatively obvious.  Even relatively simple ideas may need high levels of support to be implemented well.


Bryk AS, Gomez LM, Grunow A, et al. (2015) Learning to improve: How America's schools can get better at getting better. Cambridge, MA: Harvard Education Press.

Deming W. (1993) The new economics for industry, education, government. Cambridge, MA: Massachusetts Institute of Technology. Center for Advanced Engineering Study.

Langley GJ, Moen R, Nolan KM, et al. (2009) The improvement guide: a practical approach to enhancing organizational performance, San Francisco: John Wiley & Sons.

Michie S, Van Stralen MM and West R. (2011) The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implementation Science 6: 42.

Tichnor-Wagner A, Wachen J, Cannata M, et al. (2017) Continuous improvement in the public school context: Understanding how educators respond to plan–do–study–act cycles. Journal of Educational Change 18: 465-494.