Promoting data-literacy in schools

Just last week I hosted a #UkEdResChat on how and when to go about supporting teachers to develop research literacy.  However, the more I thought about it, the more it seemed to me that maybe we were focussing on the wrong type of ‘literacy’ and that instead we should be discussing and developing teacher ‘data-literacy’.  For it seems to me that if we want teachers to improve their day-to-day practice by engaging in some form of professional inquiry, then having the skills to effectively interpret data is essential, and far more relevant to classroom teachers than research literacy.  So to help develop my thinking on what data-literacy could and should look like, I did a quick search on Google Scholar using the term ‘teacher data literacy’ and came across Mandinach and Gummer (2016) and their conceptual framework for data literacy for teaching (DLFT), which seems to be worth sharing.  Accordingly, this post will discuss Mandinach and Gummer’s definition of data literacy and the associated conceptual framework; the dispositions and habits of mind that influence data use; and the implications for school leaders.

Data literacy for teaching (DLFT)

Using data is a key component of the teacher’s role.  In England, the 2011 Teachers’ Standards – updated in 2013 – state that teachers should use ‘relevant data to monitor progress, set targets, and plan subsequent lessons.’  However, as Mandinach and Gummer point out, this type of statement is extremely general and does not provide any guidance about the knowledge and skills necessary to be data literate.  In order to address this issue, Mandinach and Gummer offer up this definition of data literacy for teaching.

Data literacy for teaching is the ability to transform information into actionable instructional (teaching) knowledge and practices by collecting, analysing, and interpreting all types of data (assessment, school climate, behavioural, snapshot, longitudinal, moment-to-moment, etc.) to help determine instructional steps.  It combines an understanding of data with standards, disciplinary knowledge and practices, curricular knowledge, pedagogical content knowledge and an understanding of how children learn (p. 2)

Mandinach and Gummer then go on to state three propositions on which their work is grounded.

1.    There is no question that educators must be armed with data.

2.    Assessment literacy is not the same as data literacy.

3.    Simply relying on professional development to develop teacher data literacy is not enough and a conceptual framework is required.

The Data Literacy for Teaching (DLFT) Framework

The DLFT framework was the product of a number of activities – including the convening of experts, a review of published materials and resources dealing with data-literacy, and a review of relevant teaching standards – which were then synthesised by Mandinach and Gummer with what is viewed as good teaching.  This resulted in five domains of data literacy: identify problems and frame questions; use data; transform data into information; transform information into a decision; and evaluate outcomes.  The knowledge and skills associated with each domain are summarised in Table 1 – more detail can be found in Mandinach and Gummer’s article.

[Table 1: Knowledge and skills associated with each DLFT domain – image not reproduced]

Dispositions, habits of mind, or factors that influence data use

Mandinach and Gummer then go on to identify another category of components which they view as necessary for data use, which they call dispositions or habits of mind, and which are linked to general teaching rather than being specific to data use.  Nevertheless, it is argued that the six dispositions identified influence how teachers approach the use of data.  The six dispositions are:

·     Belief that all students can learn

·     Belief in data/think critically

·     Belief that improvement in education requires a continuous inquiry cycle

·     Ethical uses of data, including the protection of privacy and confidentiality of data

·     Collaborations

·     Communication skills with multiple audiences

Implications for school leaders, CPD co-ordinators and school research leads

First, it’s important to remember that this conceptual framework needs to be ‘tested’ in the field and should not be seen as the definitive statement of what data literacy looks like. As Mandinach and Gummer acknowledge, more research is required into: how data literacy may change over time (just think social media and GDPR); the possible continuum of data literacy ranging from novice to expert; what data literacy looks like for teachers at different stages of their careers, and what implications this has for the support made available to them; and how data literacy may differ for a newly qualified teacher compared to, say, a senior leader in a multi-academy trust.  In other words, this conceptual framework is not the finished article but a work in progress, and should not be ‘oversold’ to colleagues.

Second, what are the implications of the DLFT for schools wishing to incorporate ‘disciplined inquiry’, either as part of performance management or continuous professional development? (see https://www.garyrjones.com/blog/2019/3/23/performance-management-and-disciplined-inquiry).  Initial reflection would suggest that if you accept that teachers in a school will have varied levels of data literacy, then a single model of disciplined inquiry, or a single type of inquiry question, is unlikely to meet the development needs of all teachers.  Indeed, before embarking on ‘disciplined inquiry’ it may be wise to undertake some form of skills audit – so that you start off where colleagues are, rather than where you would like them to be.  This can also help you put in place different packages of support relevant for teachers at different stages of the development of their data literacy.

Third, although the tide appears to be turning against the importance of internal school data, especially in the inspection of schools (https://www.tes.com/news/ofsted-inspections-wont-examine-internal-school-data), that does not make this framework any less relevant.  Indeed, what this framework does is emphasise that data literacy is not just about assessment or progress data, but is far more comprehensive.

Finally, should attention switch from developing teacher research literacy to developing data literacy?  Well, given the lack of supporting research evidence comparing and contrasting data and research literacy in action in schools, the answer to that question has to be no.  Instead, the two skill sets should be seen as overlapping and complementary.  For example, teachers seeking to identify problems and frame questions are more likely to be able to do so if they are research literate and can use research to better understand an issue.  On the other hand, a research-literate teacher who subsequently makes a ‘research-informed’ change in practice will need the capability to evaluate the outcomes of that change in practice.

And finally

If you are interested in becoming more data literate, then I recommend that you have a look at Richard Selfridge’s book - Databusting for Schools: How to use and interpret education data. 

References

DfE (2013) Teachers’ Standards: Guidance for school leaders, school staff and governing bodies, July 2011 (introduction updated June 2013). London: DfE

Mandinach, E. B. and Gummer, E. S. (2016). What does it mean for teachers to be data literate: Laying out the skills, knowledge, and dispositions. Teaching and Teacher Education. http://dx.doi.org/10.1016/j.tate.2016.07.011

Selfridge, R. (2018). Databusting for Schools: How to use and interpret education data. London: Sage Publishing

Promoting inquiry-based working within your school

Just a few weeks ago I gave a presentation at ResearchED Blackpool in which I explored the nature of disciplined inquiry and its role in performance management. I also examined whether there was any support for the claims being made about how inquiry contributes to teacher professionalism and school improvement.  So with that in mind, I was delighted to come across the work of Uiterwijk-Luijk, Krüger and Volman (2019) examining the promotion of inquiry-based working and the inter-relationships between school boards, school leaders and teachers, as little appears to be known about how schools can create a culture of inquiry.  Furthermore, given the Education Endowment Foundation’s efforts to support governing bodies/trusts to become more evidence-informed, the publication of Uiterwijk-Luijk et al’s work is extremely timely.

Promoting inquiry-based working 

Working with three Dutch primary schools, Uiterwijk-Luijk et al set out to answer the following research question: how can the interplay between school boards, school leaders and teachers regarding inquiry-based working be characterised?  In addition, there were two subquestions. One, how can the interplay between school boards and school leaders regarding inquiry-based leadership be characterised? Two, how can the interplay between school leaders and teachers regarding, respectively, inquiry-based leadership and inquiry-based working be characterised?

A multiple case-study design was adopted, with data collected through interviews with school boards, school leaders and teachers.  Meetings were also observed, along with documentary analysis. Data were transcribed and coded against a coding scheme using MAXQDA coding software.  The coding scheme that emerged from the interviews influenced the analysis of the data from observations and documents.  Subsequently, a cross-case analysis was undertaken along with in-case analysis.

The results of the analysis suggest that both school boards and school leaders adopted a number of strategies to promote inquiry-based working, with each of these strategies incorporating a range of sub-approaches, as listed below.

Stimulating school leaders’ and teachers’ inquiry habits of mind by

  • Discussing student results with school leaders/teachers

  • Encouraging leaders to discuss student results with teachers, and teachers to discuss results with each other

  • Sharing knowledge

  • Modelling behaviour

  • Making demands 

  • Having high expectations

  • Encouraging leaders to co-operate and discuss research results with school leaders from other schools

Stimulating leaders’ and teachers’ data literacy by

  • Involving external organisations to support school leaders in conducting research

  • Developing internal expertise to support inquiry

Communicating the vision for inquiry-based working by

  • Communicating orally about the vision for inquiry-based working

Sharing leadership by

  • Sharing leadership responsibilities with teachers

Supporting inquiry-based working by

  • Providing money, time and space

  • Trusting and believing 

  • Being open to new ideas concerning research

  • Creating a safe environment

The interplay between the school board and school leaders 

It is worth noting that although these different approaches were seen in more than one school, the accomplishment and impact of the approaches differed.  Uiterwijk-Luijk et al note that in the school where there was a focus on student results, the demands made by the school board led to ‘inquiry’ being experienced ‘as part of a performativity agenda.’  Whereas in the other two schools, the boards’ demands regarding inquiry-based working were seen as part of an attempt to raise educational quality.  In addition, it was noted that none of the schools had a clear, written-down vision for inquiry-based working.

The interplay between school leaders and teachers

Interestingly, all three schools adopted the same approaches to stimulating inquiry: providing time, money and space; being open to new ideas concerning research; and creating a safe environment.  That said, there were differences between the schools.  One principal talked about an implicit rule that all decisions be based on data, a rule which was not recognised by the teachers in the school.  Furthermore, in two of the schools, teachers demonstrated the inquiry culture by being critical – i.e. asking critical questions – and by modelling behaviour – investigating and improving their own actions by comparing them with the work of others.

Implications for schools in England wishing to promote an inquiry-based culture.

Although the research was conducted in another education system, where the relationships between governing bodies, school leaders and teachers are different, it does prompt a number of questions which governing bodies/trusts, school leaders and teachers might wish to reflect on within the context of their own schools/multi-academy trusts.

  • To what extent are the various strategies and approaches adopted by school boards and school leaders seen in your context?

  • Is there a gap between the rhetoric and reality of inquiry-based working within your school?

  • To what extent is the data on which decisions are based being made explicit?

  • How are school governing bodies/trusts supporting the use of inquiry-based methods?

  • Is inquiry-working within the school perceived as a genuine attempt to bring about educational improvement or is it viewed as part of a performativity agenda?

  • Does your school have an ‘inquiry-based working’ vision and mission statement?

  • How would you describe ‘inquiry-based working’ within your school – bottom-up, top-down or multi-directional?

  • If your school is engaged in inquiry-based working, do there appear to be any negative consequences?

And finally

Writing this blogpost has made me realise that I have been paying insufficient attention to what school leader or teacher data-literacy looks like.  If colleagues are going to be encouraged to undertake some form of disciplined inquiry within their schools, then what knowledge and skills do they need so that they can draw meaningful and well-informed conclusions from their work?

References

Uiterwijk-Luijk, L., Krüger, M., & Volman, M. (2019). Promoting inquiry-based working: Exploring the interplay between school boards, school leaders and teachers. Educational Management Administration & Leadership, 47(3), 475–497. https://doi.org/10.1177/1741143217739357

All the school research lead needs is a COM-B to bring about behaviour change

Despite the increased interest in the use of evidence-informed practice within schools, research suggests that the majority of teachers are not using research to inform their teaching practice. Research from the Sutton Trust indicates that only a minority of teachers (45%) are using research to inform their teaching, with a small minority (23%) using the Education Endowment Foundation’s Teaching and Learning Toolkit.  Moreover, other research undertaken by the NFER (Nelson, Mehta, et al., 2017) suggests that research evidence is not playing a major role in teachers’ decision-making when developing their classroom practice, relative to other sources.

So given the potential magnitude of the challenge, it makes sense to look at the science of behaviour change – and the work of Michie, Van Stralen, et al. (2011) and Michie, Atkins, et al. (2016) on the Capability-Opportunity-Motivation Behaviour (COM-B) model of behaviour change – to see what help it can provide the evidence-informed practitioner who is interested in bringing about changes in teachers’ practice within their school. Indeed, interest is already being shown by the Education Endowment Foundation in the COM-B model and how it can be used to think about how interventions could be designed to increase their chance of bringing about behaviour change (Sharples, 2017).

The COM-B Model of Behaviour Change

Michie et al (2011) undertook a systematic review of behaviour change methods and identified 19 different frameworks for behaviour change. They then synthesised the 19 frameworks into a single framework with two levels, one representing intervention functions and the other high-level policy. This led to the development of the Behaviour Change Wheel, which has the Capability-Opportunity-Motivation Behaviour model at its hub. The COM-B model has three core components:

·      Capability - the individual’s psychological and physical capacity to engage in the activity concerned. It includes having the necessary knowledge and skills. 

·      Motivation - those brain processes that energize and direct behaviour, not just goals and conscious decision-making. It includes habitual processes, emotional responding, as well as analytical decision-making. 

·      Opportunity - the factors that lie outside the individual that make the behaviour possible or prompt it (Michie et al., 2011).

Each of the components can influence behaviour: for example, capability can influence motivation, as can opportunity, just as enacting a behaviour can change capability, motivation and opportunity, as illustrated in Figure 1.  So in the context of research use within schools, having access to research evidence (opportunity) or being able to understand research evidence (capability) might increase motivation to use research evidence to plan teaching and learning.  However, having the capability, motivation and opportunity is not enough, as the individual teacher still needs to act and use the research evidence to improve teaching and learning.

[Figure 1: The COM-B model – image not reproduced]

However, this only provides a very brief overview of the model and its components.  Table 1 provides definitions and examples of the components of the COM-B model as applied to the use of research evidence in schools.

Table 1 – Description of COM-B components and associated examples of evidence use in schools

[Table 1 – image not reproduced]

Adapted from Michie, Atkins, et al. (2014), p. 63

However, it needs to be emphasised that implementing those examples alone may not be enough to bring about the desired behaviour change; multiple activities or changes in each component may be required.  Indeed, research by Langer, Tripney, et al. (2016) on the science of using science in decision-making suggests that interventions that focus on only one or two of the three components are highly unlikely to bring about greater use of research evidence.  Future blogs will use the COM-B model to examine the various actions which can be taken to support behaviour change. In doing so, we will be looking at the outer rings of the Behaviour Change Wheel, i.e. intervention functions and higher-level policy.

And finally

We need to remember that getting teachers to make use of research is not in itself enough to bring about improved outcomes for pupils.  Instead, we need research-literate teachers to be using research, and current research suggests that teachers have: a weak, but variable, understanding of the evidence-base relating to teaching and learning strategies; a weak, but variable, understanding of different research methods and their relative strengths; and a particularly poor understanding of the evidence-base that requires scientific or specialist research knowledge (e.g. the validity of ‘neuromyths’) (Nelson, et al, 2017).

References

Langer, L., Tripney, J. and Gough, D. (2016). The Science of Using Science: Researching the Use of Research Evidence in Decision-Making. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.

Michie, S., Van Stralen, M. M. and West, R. (2011). The Behaviour Change Wheel: A New Method for Characterising and Designing Behaviour Change Interventions. Implementation Science. 6. 1. 42.

Michie, S., Atkins, L. and West, R. (2014). The Behaviour Change Wheel. A guide to designing interventions. 1st ed. Great Britain: Silverback Publishing. 

Michie, S., Atkins, L. and Gainforth, H. L. (2016). Changing Behaviour to Improve Clinical Practice and Policy. In   Axioma-Publicações da Faculdade de Filosofia. 

Nelson, J., Mehta, P., Sharples, J. and Davey, C. (2017). Measuring Teachers’ Research Engagement: Findings from a Pilot Study: Report and Executive Summary. London. Education Endowment Foundation/NFER

Sharples, J. (2017). Untangling the ‘Literacy Octopus’ – Three Crucial Lessons from the Latest Eef Evaluation. Education Endowment Foundation Blog. https://educationendowmentfoundation.org.uk/news/untangling-the-literacy-octopus/

The school research lead, testing interventions and better teaching and learning

As regular readers of this blog will know, I’m a great believer that those of us who are interested in evidence-based education can learn much from the experience of colleagues working in the field of evidence-based medicine.  So I was delighted to have recently come across Evans et al’s (2011) book Testing Treatments: Better research for better healthcare, which sets out to explore how we can ensure that medical treatments best meet the needs of patients.  In particular, it looks at how to ensure ‘that research is soundly based, properly done, able to distinguish harmful from helpful treatments, and designed to answer questions that matter to patients, the public, and health professionals’.

What I like about the book is how upfront it is about the limitations of research, i.e. all it can really do is help us become a little less uncertain about how to proceed.  High-quality research can indicate the probability that a particular treatment will lead to benefits or harms for patients, which is equally true of research into interventions within education.

Helpfully, at the end of each chapter Evans et al provide a list of key points to think about, many of which are, for me, equally applicable in education.  Below is an edited list of key points, amended for educational settings.

  • Testing new teaching strategies/interventions is important because new teaching strategies are as likely to be worse as they are to be better than existing teaching strategies

  • Biased and unfair tests of interventions can lead to pupils losing out and having reduced life chances

  • Just because a major body recommends an approach does not mean it might not harm pupils

  • The beneficial effects of interventions are often overplayed, and the harmful effects underplayed

  • Neither theory nor professional opinion is a reliable guide to effective interventions

  • Just because an intervention or practice has been used for years does not mean that it is effective

  • Even if pupils are not ‘harmed’ by an ineffective intervention, using it is a waste of resources

  • More intensive use of an intervention is not necessarily beneficial, and can sometimes do more harm than good

  • Better diagnosis does not necessarily lead to better outcomes, sometimes it makes matters worse

  • Screening programmes should only be introduced on the basis of sound evidence of their effects

  • Dramatic effects of an intervention are rare

  • Uncertainties about the effects of an intervention are common

  • Fair tests of interventions are needed because otherwise we will sometimes conclude that interventions are useful when they are not, and vice versa

  • Comparisons are fundamental to all fair tests of interventions

  • A single study rarely provides enough evidence to guide intervention choices in education

  • Assessments of the relative merits of alternative interventions, should be based on systematic reviews of all the relevant, reliable evidence

  • New research should only proceed if an up-to-date review of earlier research shows that it is necessary

  • Much research is of poor quality and done for questionable reasons

  • Input from teachers, pupils and school stakeholders can lead to better research

Of course, this is not an exhaustive list of what needs to be done to improve the quality and usefulness of educational research. On the other hand, it will help you become more sceptical about some of the claims made about educational interventions. And by being sceptical, I’m not talking about being a continual ‘naysayer’, but rather someone who withholds approval or disapproval until they have made appropriate and rigorous inquiries.

Reference

Evans, I., Thornton, H., Chalmers, I. and Glasziou, P. (2011). Testing Treatments: Better research for better healthcare. Pinter & Martin Publishers

Disciplined Inquiry as a panacea for performance management - where’s the evidence?

This is the link to my session at researchED Blackpool, in which I put forward the following argument:

  • Colleagues in research schools (and wider) are showing an interest in disciplined inquiry

  • This is a product of three things

    • Dylan Wiliam’s view that all teachers should seek to improve and should take part in ‘disciplined inquiry’

    • Bloggers writing about disciplined inquiry

    • Widespread dissatisfaction with current models of performance management in schools

  • Disciplined inquiry is now being used in a number of schools as an integral part of schools’ performance management processes and CPD activities

  • However, this is being done with little or no reference to the research literature on what makes for effective performance management processes; the relationships between disciplined inquiry, teacher knowledge, skills, attitudes and behaviours, and teacher outcomes; and different types of inquiry – such as action research

  • Ironically, the promotion of disciplined inquiry as part of performance management is an example of what the evidence-based community is trying to avoid, i.e. addressing problems with little reference to the research evidence-base and adopting practices promoted by gurus

  • Nevertheless, this does not mean we should not show interest in ‘disciplined inquiry’ as a way of addressing the problems associated with performance management in schools.  

  • Although we should be upfront and say that while the adoption of disciplined inquiry seems a good idea, there is little or no robust evidence about what works, where, for whom, to what extent, and for how long, when undertaken as part of performance management.