Saturday 24 February 2018

Teaching Staff Turnover and Employee Engagement

During this week's #UKEDResChat discussion a number of Tweets mentioned concerns about levels of teaching staff turnover and how to go about creating reliable and valid measures of staff satisfaction and engagement. So with that in mind, I thought I'd have a look at Bamford and Worth (2017) and their report for the NFER on the reasons why teachers leave the teaching profession. I will then focus on one recommendation of the report - the need for schools to measure job satisfaction and engagement and intervene - to show how that might be easier said than done.

Why do teachers leave the teaching profession?

Drawing on data collected from 40,000 households as part of the Understanding Society longitudinal study, Bamford and Worth (2017) found the following.

  • More than half of non-retiring teachers who leave remain working in the education sector.
  • Teachers do not leave for higher-paid jobs: overall pay decreases, but hourly wages stay the same.
  • Leavers' working hours decrease and many secondary leavers take up part-time positions.
  • Leavers' job satisfaction and subjective well-being improve after leaving. 

Bamford and Worth then go on to make the following recommendations:

  • School leaders should regularly monitor the job satisfaction and engagement of their staff, and intervene 
  • Government and other secondary-sector stakeholders need to urgently look at ways of accommodating more part-time working in secondary schools 
  • School leaders, Government and Ofsted need to work together to review the impact their actions are having on teacher workload, to identify practical actions that can be taken to reduce this 

An evidence-based approach to monitoring job satisfaction and engagement 

Monitoring job satisfaction and engagement and subsequently intervening may seem a very sensible and obvious recommendation. However, it may be a lot easier said than done. To help understand why this might be the case, I'm going to look at the work of Briner (2014), who raises some very pertinent questions about employee engagement. So here goes:

Defining engagement - unfortunately there is no one agreed definition of engagement. 

The consequence of this is as Briner states: From a practical (and academic) perspective the absence of agreement about what something means - and an absence of concern about that lack of agreement - is not funny or weird or cute or unfortunate or inconvenient. It's a confused, confusing and chaotic mess that is almost bound to lead to messy and undesired outcomes. It means that whenever we talk about or think about or try to measure 'engagement' we are almost certainly saying different things, understanding different things, measuring different things and doing different things but believing quite incorrectly they are all the same. 

Measuring engagement - if there is no agreement about the nature of employee engagement, the chance of developing valid, reliable and meaningful measures is slim. 

Again as Briner states: As a consequence of confused definition and overlap with other existing ideas there is currently little evidence that engagement measures are particularly valid or reliable. There is one crucial form of validity - predictive validity - for which there seems to be almost no evidence at all. This form of validity is essential as it explores whether measures, in this case of engagement, actually predict anything important in the future. At the present time therefore we do not have enough good quality evidence to allow us to draw even tentative conclusions about whether or how engagement can be measured in a valid and reliable way.
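Briner's point about predictive validity can be made concrete with a toy calculation. The sketch below (in Python, with entirely hypothetical survey numbers) checks whether engagement scores collected at one point in time correlate with a performance measure collected later - the kind of evidence a valid engagement measure would need to provide:

```python
# Illustrative only: a toy check of "predictive validity" - do engagement
# scores gathered at time 1 bear any relation to a performance measure at
# time 2? All figures below are hypothetical, not real survey data.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical staff survey scores (1-5 scale) and a later performance rating
engagement_t1 = [3.2, 4.1, 2.8, 4.5, 3.9, 2.5, 4.0, 3.5]
performance_t2 = [2.9, 4.0, 3.1, 4.2, 3.6, 2.8, 3.8, 3.3]

r = pearson_r(engagement_t1, performance_t2)
print(f"r = {r:.2f}")  # a consistently high r, replicated over time, would be
                       # the beginning of a case for predictive validity
```

Even this toy example hints at why the evidence base is thin: a single correlation from one survey says nothing about causation, and Briner's point is that even this minimal kind of evidence is largely absent for engagement measures.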

Engagement is nothing new or different 

Briner sets out two possibilities as to whether engagement is a new or different concept:

Engagement is not a new and different idea: If this is the case then the term and idea should be immediately discontinued because using a new term to describe existing concepts is confusing and unhelpful. 

Engagement is a new and different idea: If this is the case then there is a huge amount of work to be done first to define engagement in a way that shows precisely how it is new and different and second to gather good quality evidence to show that measures of engagement are measuring something new and different. 

There is a lack of good-quality evidence about employee engagement

As Briner states: 'There is almost no good quality evidence with which to answer the most important questions about engagement:

Fundamental Question 1: Do increases in engagement cause increases in performance?
Fundamental Question 2: Do engagement interventions cause increases in levels of engagement and subsequent increases in performance?'

Over-claiming and misclaiming

Briner argues that these four issues raise serious questions about the usefulness of the idea of employee engagement. Nevertheless, there is an additional challenge:

That the proponents, supporters and advocates of engagement both over-claim by exaggerating the quantity and quality of evidence and mis-claim by making statements about engagement that, on closer inspection, seem to be about something else.

What are the implications of this discussion for school leaders who wish to monitor job satisfaction and engagement?

  • It will be a waste of time and resources for the school to try and develop its own valid and reliable measures of employee engagement
  • Staff surveys are highly likely to tell you very little; indeed, as Argyris (1994) argues, they may even get in the way of learning what needs to be done.
  • Multiple proxy measures of employee engagement are going to be required to help school leaders make a judgement about employee engagement.

And finally

How school leaders tackle the challenge of employee engagement comes down to a choice as to the type of school leaders they want to be. Are they school leaders who carefully examine the evidence on a particular issue, are explicit about what they know or don't know, and then act accordingly? Or do they want to be school leaders who are not overly bothered about the quality of the evidence, subsequently misclaim and misrepresent the evidence for their own purposes, and come up with superficial solutions to complex issues? The choice is yours! (Amended from Briner)


Argyris, C. (1994). Good Communication That Blocks Learning. Harvard Business Review, 72(4), 77-85.
Bamford, S. and Worth, J. (2017). Teacher Retention and Turnover Research. Research Update 3: Is the Grass Greener Beyond Teaching? Slough: NFER.
Briner, R. (2014). What Is Employee Engagement and Does It Matter? An Evidence-Based Approach. The Future of Engagement Thought Piece Collection, 51.

Sunday 4 February 2018

The School Research Lead and Teacher Journal Clubs - Summarising the evidence

In this post I look at how a school research lead might wish to summarise the evidence about teacher journal clubs. In doing so, I will try to ensure we have a format that allows the inclusion of four sources of evidence and also takes into account the context of the individual school. However, given workload pressures, it is important that whatever report or document is produced can be produced relatively quickly and without being burdensome. As such, whilst the example uses a Word-based tabular format, the same information could also be presented in a 10-12 slide PowerPoint presentation or through the use of some kind of mind map.

The template

The following example has been produced for a fictional school, which is considering introducing a teacher journal club into its professional learning programme.  As will be seen from the example, the school is relatively new to research and is just beginning to put its 'toe in the water'.

 Teacher Journal Clubs
 Background question

How can teacher journal clubs contribute to teacher professional learning and the use of evidence-based practice?
Teacher journal clubs appear to have the potential to contribute to the increased use of evidence-informed practice. Initial discussions with stakeholders suggest there is support for piloting a journal club within the school. Although no-one within the school - be they teaching assistants, teachers or senior leaders - has experience in running journal clubs, adequate resources are available on the internet to support their introduction.
Description of the best available evidence
Although there appear to be no systematic reviews in educational settings about the use of teacher journal clubs, a systematic review in a health setting (Deenadayalan et al., 2008) provides guidance on how to run a successful journal club. This guidance suggests: regular and anticipated meetings, mandatory attendance, clear long- and short-term purpose, appropriate meeting timing and incentives, a trained journal club leader to choose papers and lead discussion, circulating papers prior to the meeting, using the internet for wider dissemination and data storage, using established critical appraisal processes and summarizing journal club findings (from the abstract).
Recent research in education (Sims et al., 2017) involving two 11-18 mixed secondary schools (rated outstanding by Ofsted) indicates that journal clubs are a viable, scalable model of teacher-led professional development, capable of creating sustained increases in evidence-informed practice.
School Data
The school is a mixed 11-18 school and is currently rated by Ofsted as good. The school has an extensive programme of professional learning - though little or none is focused on research use. The school has recently recruited a number of new staff who are at the beginning of their career. However, there are also a number of staff who have been at the school for over twenty years. Although in recent years the professional learning budget has been squeezed, there is still sufficient time in the programme for half-termly journal clubs.

Stakeholders’ views (pupils, staff, parents, community)
A number of teachers within the school are active on Twitter and are aware that the school currently provides few opportunities for teachers to engage with research evidence. Successful schools in the locality have introduced journal clubs and it is perceived that this has contributed to those schools' reputation for innovation. However, there are other teachers who do not see the value of educational research and are aware of schools which have introduced journal clubs - and then quietly dropped them after a year. Nevertheless, there is a general consensus amongst the teaching staff that it may be worth undertaking a small pilot with volunteers.

Practitioner expertise – key leaders
None of the major decision-makers within the school - the HT, 2 DHTs and the newly appointed School Research Lead (SRL) - has experience of running or participating in a journal club. However, the SRL has attended a number of researchED events and has seen presentations on how to successfully run a journal club. The SRL is also aware of resources available on the internet and produced by teachers - which give clear advice on how to ensure a journal club is successful. In addition, the SRL is currently studying for a post-graduate degree in education.

Questions for consideration

  • Can we access suitable research journals?
  • How do we recruit volunteers for the pilot?
  • Do teachers have the capacity and capability to understand and apply research findings?
  • Do we have someone with sufficient knowledge and expertise to lead the journal club?
  • Can desired changes in teaching practice be identified?
  • Is sufficient time available for the implementation of a journal club?
  • How will the impact of the journal club be measured?

References and resources

  • (Deenadayalan et al., 2008)
  • (Sims et al., 2017)


  • School research lead

  • To be shared by email and to be discussed at the next staff meeting
  • Prior discussion of paper at departmental meetings

Update and review

  • When is it likely that new relevant evidence will be available?
  • During 2018, as reports on the efficacy of Research Learning Communities and School Research Leads are published by the EEF.
  • End of the academic year.