Friday 25 November 2016

Guest post - Professor Tim Cain - How do teachers use research?

As we approach the end of term and with ever increasing levels of tiredness setting in, it seems sensible to revisit that old proverb 'a change is as good as a rest'.  So with that in mind, I'm delighted to say that this week's offering is a guest post by Tim Cain, who is Professor in Education and Director of the Research Centre for Schools, Colleges, and Teacher Education at Edge Hill University.  Professor Cain is going to share with us his findings from two recent research studies in which teachers have engaged with published research.

How teachers use research: two studies
How do teachers use published research? What do they do with it and how does it change their thoughts and actions? These are among the questions that I have addressed through two empirical research studies. Working with two teacher research groups over a 12-month period, I have discovered how some teachers made links from research papers to practical concerns. The studies were identical in their aims and methods; their overarching research question was, ‘How can educational research impact on teachers and teaching?’ 
The research took place in two secondary schools in the North of England, anonymised as ‘Hilltown High’ and ‘Riverside’. Both headteachers perceived the need to improve provision for their ‘gifted’ and ‘talented’ (G&T) students, many of whom were not achieving the expected academic standards. They appointed a coordinator and recruited volunteers to join the project – eight teachers from Hilltown and six from Riverside – with the expectation that they would read research articles that I provided for them and, bearing in mind what they had read, would use these papers to inform their own practitioner enquiry. Research around teaching G&T students was presented in the form of three journal articles, which I thought would be accessible to practitioners: Berlin (2009), Rogers (2007) and Tomlinson (2005). Two are authoritative literature reviews and the third is an empirical study. The teachers were told that they could access further research if they wished; in Hilltown, the coordinator provided each teacher with a copy of an Ofsted report about G&T (Ofsted 2009) and one of the teachers sourced and used additional research into teachers’ use of questioning; otherwise, the influence of research on practice came through these three journal articles. During an early meeting, the teachers presented their understanding of the research papers to each other. Thereafter, my role was to support their practitioner enquiries through monthly meetings at which I prompted discussion, chiefly by asking questions about their projects and their use of research evidence.


With their consent, I interviewed the teachers, twice each: once at around the mid-point of the project and once towards the end. Interviews were semi-structured around a few questions, allowing for fairly free-flowing conversations, and the time to explore matters in some depth.  Interviews were audio-recorded and transcribed; data were split into meaningful units for coding. At the conclusion of the research, the teachers wrote brief descriptions of their projects; these were published internally by the schools and also formed part of the research data, along with my field notes of our monthly meetings. In summary, the data included:
        Field notes from 14 selection interviews and 22 monthly meetings
        26 individual interview transcriptions (each c.30 minutes)
        14 written reports of the teachers’ projects
The teachers’ discussions were fairly wide-ranging; they rarely discussed the research papers directly, except in the first meeting and, occasionally, during interviews. Nevertheless, by comparing what they said with what was in the three papers, it was possible to see how the teachers had understood the papers, how this had affected their thinking (or not) and what, if anything, they had done with what they had read.

Findings: how the teachers incorporated the research into their thinking
The teachers incorporated information from the research papers into their thinking by bringing research-generated knowledge into relationship with other knowledge. The process began when teachers asked, possibly subconsciously, ‘does a particular claim in the research paper match my previous experiences?’ The answer to this question seemed to determine their further engagement with the research – whether they dismissed the claim as either implausible or obvious, or whether they continued to include the research in their thinking. Sometimes, they ignored the claim, moving quickly on to other matters. If this did not happen, i.e. if they decided, individually or collectively, to give the research more thought, they:
a)    used concepts from research to develop their own understandings of concepts they had gained practically. For example, from seeing their G&T students as essentially privileged (with ‘gifts’ or ‘talents’) they moved to a position of seeing some of them as not necessarily adapting well to school, being sometimes bored, insufficiently challenged and, although probably having a passion for their particular area of expertise, unwilling to present themselves as able, for fear of peer pressure. In discussion, they explored how research findings were applicable to their schools: how students find it generally acceptable to be seen as G&T in some (mainly practical) subjects but not others; how girls can feel comfortable being seen as G&T in some subjects where boys cannot; how students do not necessarily like their achievements to be publicly recognised (e.g. in school assemblies).
b)   related research findings to instances from their previous experiences of teaching and being taught. These previous experiences gave them a means to explore and understand the general in the light of the specific and vice-versa, and enabled impersonal and abstract knowledge, generated in unfamiliar contexts beyond the institution, to become useful within a familiar institutional context. Cases usually consisted of individual students or classes. For example, Tomlinson (2005) has an extended passage, explaining why gifted students are ‘anything but formulaic’ in terms of the spectrum of ability they display, their socio-cultural backgrounds, biological characteristics and the presence or lack of additional needs (p. 160). The teachers related this passage to students they had known, and also recognised that, in the words of one teacher, ‘that’s the problems with groups, and groups having names [i.e. labels]. Sometimes you almost wash over them and think, ‘well this works for these students’, rather than thinking of them as individuals’.
c)    imaginatively diffused implications from research into areas beyond those in the original research. Teachers extended knowledge from the research papers imaginatively into their thinking about many topics. This occurred because, although teachers’ knowledge is necessarily widely-focused, each aspect of their pedagogical knowledge inter-relates with others. So, although the research papers focused strongly on G&T students, the teachers discussed what these matters might mean for issues including curriculum, pedagogy, teaching techniques, other students, learning, resources, assessment, behaviour management, leadership, management, policy, accountability and values.

Long, focused discussions and the ‘Third Voice’
In each school, the whole project, including the research texts and the teachers’ practitioner research, can be thought of as one, long, focused discussion, addressing the question, ‘How can we better provide for our G&T students?’ In discussion, the teachers offered their thoughts and opinions, supporting them with evidence from their values and knowledge, particularly experiential knowledge. Sometimes agreeing with each other, sometimes disagreeing, they supported and encouraged, tested and challenged each other. They questioned old ideas and developed new ones, critically examining possibilities. Effectively, the research acted as a voice in this ‘long, focused discussion’. In discussion, each teacher had access to three sets of voices: their own, articulating their values, previous experiences and ways of thinking and acting (the ‘first voice’); their colleagues’, who shared some of these but not others (the ‘second voice’); and the research, which provided an external view (the ‘third voice’).
This ‘third voice’ was never a strong voice; it was always subordinate to the ‘first’ and ‘second’ voices. Some teachers found it old and possibly out-of-date; it was generated in unfamiliar contexts and was perhaps slightly inaccessible. It could be ignored at times, and the teachers did not shy away from criticising it. But sometimes, the research voice was thought about and acted upon. Experience sometimes prevailed but at other times, the teachers’ opinions and habitual ways of thinking changed, as discussion with their colleagues and the research challenged them with perspectives different from their own.
In order to be admitted to the discussion, knowledge from research had to be brought into relationship with other knowledge, usually from the teachers’ previous experiences of teaching and being taught. At least one teacher in the discussion had to find the research knowledge neither implausible nor obvious. Once admitted to the discussion, the research influenced both the content of teachers’ thinking, and their ways of thinking. This sometimes led to practical changes and, when it did, this could be called ‘research informed teaching’. Research which was not brought into the discussion was either ignored or used strategically, to justify pre-determined actions. Sometimes, research fulfilled a confirmatory role, reassuring teachers that their practice accorded with research. 
Within these long, focused discussions, the research texts gave the teachers material to think about, including:
a)    Focuses for inquiry. Most teachers reported that the research projects had encouraged a stronger than normal focus on G&T students and often, the focus raised questions about their practice, in the light of research. For example, the teachers discussed diversity of G&T students, pace of instruction, appropriate curriculum and developing passions (from Tomlinson 2005); perceptions of G&T students, confidence and self-efficacy (from Berlin 2009); challenge, grouping and independent learning (from Rogers 2007).
b)   Challenges to existing thinking and practice. One example of existing thinking was the belief that students’ learning is made more secure by teaching others. This idea was accepted at both schools, and Riverside School had a policy that G&T students would act as ‘Lead Learners’, teaching other students. However, the teachers interpreted a finding in Rogers (2007) as challenging this policy and, in discussion, the teachers agreed that, although their own knowledge was consolidated by teaching others, what is true for teachers is not necessarily true for students.
c)    Concepts. One example of a concept is the notion of ‘peer sanctions’ which non-gifted students sometimes apply to their G&T peers (Berlin 2009). Riverside teachers thought that such sanctions no longer occurred in their school but Hilltown teachers were less sanguine, saying for example, ‘some of the G&T students identified, do in some ways want to be under the radar because of the negative social stigma’. Thus the concept of ‘peer sanctions’ sensitised the teachers to possibilities, prompting them to see problems which they did not see previously (Biesta 2007).
d)   Ideas for action. For example, one teacher, understanding that G&T students can need comparatively little practice in new skills, was motivated to create new resources for G&T students which relied not on practising new skills but on ‘quite philosophical articles and higher order questions’.
There is also evidence that the research papers influenced how the teachers thought. This included:
a)    Becoming more willing to experiment.  For example, one teacher explained that she had been afraid of challenging students in her lessons, in case they did not understand her questions.  However, reading the research had given her permission to challenge them, and she had now done so, even though it had sometimes provoked uncomfortable silences. She remarked, ‘I think it's helped me be a better teacher’.
b)   Becoming more critical. At various times the teachers critiqued the research papers, citing within-research issues, issues around generalizing from research to practice, and non-congruence with personal values as reasons for their criticism.
c)    Developing their understanding of evidence. The teachers discussed differences between what they called ‘hard’ evidence (such as test data) and ‘soft’ evidence (such as observation data). Although they started with a preference for hard evidence, and tried to gather hard evidence for their own practitioner enquiry, they moved to a more sophisticated view of evidence which reflected the complexity of educational situations.
d)   Developing ethical awareness. The teachers discussed ethical issues such as paying more attention to G&T students (and possibly less attention to others), grouping G&T students together, and providing activities specifically for G&T students.
Such influence came not only from research findings but the whole papers, including literature reviews and discussions of findings.
The process, including some aspects which have not been fully discussed here, is illustrated below. Further details are available in the published papers (see references).

How robust is this theory?
This theory is based on only two studies, both of which were small-scale. The teachers were volunteers and, as part of the project, were using the published papers to guide their own practitioner research projects. This means that:
These studies have not shown how all teachers will use published research, only how some teachers have used published research.
Further questions arise: how many teachers (for example, as a proportion of the workforce in England) will use published research in the ways described here? Which ones? What are the main factors that determine who does, and who doesn't, use research in these ways? What are the factors to do with, a) the research, b) the way in which the research is presented, c) the individual teachers and d) their schools? As always, these studies have posed more questions than answers, but they also present a theory which can be investigated further.

References (copies available from the author on request)
Cain, T. 2015a. Teachers’ engagement with published research: addressing the knowledge problem. Curriculum Journal, 26(3), 488-509.
Cain, T. 2015b. Teachers’ engagement with research texts: beyond instrumental, conceptual or strategic use. Journal of Education for Teaching.
Cain, T. 2016a. Denial, opposition, rejection or dissent: why do teachers contest research evidence? Research Papers in Education, ahead of print.
Cain, T. 2016b. Research utilisation and the struggle for the teacher’s soul: a narrative review. European Journal of Teacher Education, ahead of print.

Saturday 19 November 2016

The school research lead, premortems and avoiding the avoidable

As we approach the end of the autumn term, plans are no doubt being developed for new initiatives or innovations which are going to be introduced in the New Year.  Unfortunately, the introduction of an initiative, policy or change within a school often results in failure, maybe in part due to individuals not being willing to speak up about potential problems with the change or intervention, particularly if it is viewed as a headteacher's 'pet project'.  So to help with this problem, I'm going to use the work of Klein (2007a), who developed a technique called the premortem, which happens to be one of Daniel Kahneman's favourite approaches for increasing the quality of decisions within organisations.

What is a premortem?

As Klein states: A premortem is the hypothetical opposite of a postmortem. A postmortem in a medical setting allows health professionals and the family to learn what caused a patient's death.  Everyone benefits except, of course, the patient.  A premortem in a business setting comes at the beginning of a project rather than the end, so the project can be improved rather than autopsied.  Unlike a typical critiquing session, in which project team members are asked what might go wrong, the premortem operates on the assumption that the 'patient' has died, and so asks what did go wrong. The team members' task is to generate plausible reasons for the project's failure. (p. 1)

In the context of a school or MAT, a project premortem comes at the beginning of a new initiative - say a new timetabling system or uniform policy - and works out why the new initiative has not delivered the desired results and has created a number of negative unintended consequences.  In other words, you imagine the initiative has failed and is to be withdrawn.

Why do we need pre-mortems?

Kahneman (2011) argues that in decision-making there is a pervasive optimistic bias, which is reflected in the following tendencies:
  • perceiving circumstances as more benign than they really are;
  • overstating our own attributes and associated skill levels;
  • setting objectives believing they are more achievable than they are likely to be;
  • overconfidence in our ability to predict the future and how events will unfold;
with all these tendencies contributing to an overconfidence in the wisdom of a particular course of action.

Kahneman goes on to argue that as a senior leadership team or project group gets closer to making a decision, the expression of doubts about the wisdom of the proposed course of action gets suppressed.  This is particularly the case if, say, the CEO, headteacher or head of department has already indicated a preference for a particular plan.  As a consequence, competing views and opinions are often seen as indicating a lack of commitment to the group.   Accordingly, Kahneman identifies two main benefits of conducting premortems: first, they legitimise dissent and reward people for being imaginative and creative; second, even proponents of the planned course of action are encouraged to look for weaknesses and threats which may not have been so obvious in earlier group deliberations.

How to conduct a premortem?

Klein (2007b) usefully provides an outline process for conducting a premortem, which I have adapted for use in an educational setting.  Prior to the final decision to proceed with a project or policy, a group of people who are knowledgeable about the project - including both the project leader and sponsor - get together for a meeting lasting approximately 45-60 minutes.

Step 1 : Preparation: Team members take out sheets of paper and get relaxed in their chairs.  They should already be familiar with the plan, or else have the plan described to them so they can understand what is supposed to be happening.

Step 2: Imagine a fiasco.  When I conduct the Premortem, I say I am looking into a crystal ball and, oh no, I am seeing the project has failed.  It isn't a simple failure either.  It is total, embarrassing, devastating failure.  The people on the team are no longer talking to one another.  Our company is not talking to the sponsors.  Things have gone as wrong as they could.  However, we could only afford an inexpensive model of the crystal ball so we cannot make out the reasons for the failures.  Then I ask, "What could have caused this?"

Step 3 : Generate reasons for the failure.  The people on the team spend the next three minutes writing down all the reasons why they believe the failure occurred....

Step 4: Consolidate the lists.  When each member of the group is done writing, the facilitator goes around the room, asking each person to state one item from his or her list.  Each item is recorded on a whiteboard.  The process continues until every member of the group has revealed every item on their list.  By the end of this step, you should have a comprehensive list of the group's concerns with the plan at hand.

Step 5 : Revisit the plan.  The team can address the two or three items of greatest concern, and then schedule another meeting to discuss ideas for avoiding or minimising the other problems.

Step 6 : Periodically review the list.  Some project leaders take out the list every three or four months to keep the specter of failure fresh, and to resensitize the team to problems that may just be emerging. (Klein, 2007b, pp. 99-100)
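For facilitators who like to keep things systematic, the consolidation at the heart of Steps 3 to 5 can be sketched in a few lines of code. This is purely a hypothetical illustration - the function name, the sample data and the ordering rule (most widely shared concerns first) are my own assumptions, not part of Klein's method:

```python
from collections import Counter

def consolidate_reasons(individual_lists):
    """Merge each team member's written reasons for the imagined failure
    into one deduplicated list, ordered by how many members raised each."""
    counts = Counter()
    for reasons in individual_lists:
        # Count each reason once per person, even if they wrote it twice
        for reason in set(r.strip().lower() for r in reasons):
            counts[reason] += 1
    # Most widely shared concerns come first (useful for Step 5)
    return [reason for reason, _ in counts.most_common()]

# Illustrative lists from a three-person team (Step 3)
team_lists = [
    ["Staff not consulted", "Timetable clash"],
    ["Timetable clash", "No training budget"],
    ["Timetable clash"],
]
print(consolidate_reasons(team_lists)[0])  # → timetable clash
```

Ranking by how many team members independently raised a concern is one simple way of deciding which 'two or three items of greatest concern' to address first when revisiting the plan.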

What are the implications for evidence-based practitioners and school leaders?

Pre-mortems are an essential tool for the evidence-based practitioner.  Evidence-based practice involves making conscientious, explicit and judicious use of multiple sources of evidence to bring about favourable outcomes.  A premortem aids conscientiousness by opening up the decision-making process to scrutiny and by providing access to a range of stakeholder views, and in doing so increases the evidence base.  Second, pre-mortems help make thinking explicit: the reasons identified for a project's 'failure' lead to underpinning data and assumptions being challenged.  It's a process which has the potential to flush out cognitive biases.   Finally, it's likely to reduce the number of ineffective decisions - those which do not provide the benefits intended - and increase the chances of resources, time and expertise being used to contribute to favourable outcomes.


Kahneman, D. (2011). Thinking, fast and slow. Macmillan

Klein, G. (2007a). Performing a project premortem. Harvard Business Review, 85(9), 18-19.

Klein, G. (2007b). The power of intuition: How to use your gut feelings to make better decisions at work. Crown Business.

Sunday 13 November 2016

The school research lead and analysing your school's readiness for innovation.

Many school-based interventions fail because they are introduced at too large a scale.  Interventions or innovations which might well have worked if they had been developed in a small pilot get rolled out across a school with insufficient thought given to the know-how, capacity and staff commitment available to support the intervention.   In this post I draw upon the work of Bryk et al (2015) to help school research leads advise senior leaders about the appropriate scale of implementation when introducing a new intervention within a school.

Sizing up the context

Bryk et al (2015) argue that one of the many reasons why educational interventions/innovations fail is that insufficient thought is given to the organisational context into which the intervention is to be introduced.  So to help school leaders and others improve their ability to understand their context, they have developed an organisational framework based on three core criteria.
  1. What is the existing level of know-how about the intervention?  What does the research say about this intervention? Do we know what works, why, for whom, and in what context?
  2. What is the school's capacity and capability to cope with the intervention? Do staff have sufficient time, opportunity and skills to bring about meaningful changes in their practice?
  3. What is the level of engagement: to what extent are participants - predominantly teachers - in favour of the intervention? In other words, how willing are teachers to do what is necessary to make the innovation work?
Now, depending upon how these three 'variables' play out, a particular level of implementation will be suggested as right for that school's context.  It may be that there is little know-how, capacity/capability or commitment towards the intervention, which may mean either that the intervention is abandoned or that a small number of volunteers are sought.  On the other hand, there may well be extensive know-how on how to make something work, sufficient resources to support the implementation, and wide-scale teacher commitment to the intervention, which might justify a whole-school roll-out.  And of course, there may be levels of intervention which lie in between these two extremes - such as departmental, year group or key stage.
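To make the logic of this sizing-up concrete, here is a hypothetical sketch of how the three criteria might map onto a suggested scale of implementation. The ratings, thresholds and wording are illustrative assumptions of mine, not taken from Bryk et al:

```python
def suggest_scale(know_how, capacity, engagement):
    """Each criterion is rated 0 (low), 1 (moderate) or 2 (high).
    Returns a suggested scale of implementation for the intervention."""
    weakest = min(know_how, capacity, engagement)
    if weakest == 0:
        # Weak on any one criterion: pilot with a few volunteers, or pause
        return "very small pilot with volunteers (or abandon)"
    if weakest == 2:
        # Strong on all three criteria: a whole-school roll-out may be justified
        return "whole-school roll-out"
    # Mixed picture: something between the two extremes
    return "department, year-group or key-stage pilot"

print(suggest_scale(know_how=1, capacity=2, engagement=1))
# → department, year-group or key-stage pilot
```

The key design point is that the weakest of the three criteria drives the decision: extensive research know-how cannot compensate for staff who lack the time or willingness to implement the change.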

Bryk et al summarise the above in the following table.

Implications for school research leads and school leadership teams.

Bryk et al identify a number of principles and valued goals which are supported by this approach.

General Principles
  1. Wherever possible, learn quickly and cheaply
  2. Be minimally intrusive - some changes will fail, and we want to limit negative consequences on individuals' time and personal lives; and
  3. Develop empirical evidence at every step to guide subsequent principles.
Valued Goals
  1. Through small tests of change, we develop the technical knowledge to turn a good idea into something that can be actually executed effectively.
  2. We build capabilities among the individuals involved in the early testing.  The practical expertise they develop through working out a set of change ideas becomes an invaluable resource to coach others as they learn to do the same.
  3. As early adopters experience heightened efficiency in their day-to-day work, they also become champions for these changes to be taken up by others. (p. 120)

Some final words

As Bryk et al argue, not every school-based innovation has to start out as a very small or small-scale intervention.  If the correct conditions are in place regarding know-how, resources (human, physical and financial) and commitment, then whole-school implementation may well be appropriate.  On the other hand, more often than not, an analysis of the context of the school will suggest either a very small or small-scale pilot.


BRYK, A. S., GOMEZ, L. M., GRUNOW, A. & LEMAHIEU, P. G. 2015. Learning to improve: How America's schools can get better at getting better.

Friday 4 November 2016

The School Research Lead and Improvement Science - What is it?

Over the last two years, I have argued that one of the roles of school research leads is to help teachers to improve, not prove.  So with that in mind, it’s appropriate to see what we can learn from the field of improvement science.  Drawing upon Bryk, Gomez, Grunow and LeMahieu’s 2015 book ‘Learning to Improve: How America’s Schools Can Get Better at Getting Better’, I will illustrate how a process of disciplined inquiry can be combined with networks to scale up promising educational strategies, ‘through a process of learning fast to implement well’.  With this in mind, this post will:
  • Define what is meant by term Improvement Science
  • Identify six key principles associated with Improvement Science
  • Explore the implications for school leaders and school research champions.
Improvement Science : a definition

Bryk et al (2015) define improvement science as a methodology that disciplines inquiries to improve practice. Undergirding it is a distinctive methodology about what we seek to learn and how we may come to understand it well.  Particular acts of inquiry are improvement research projects.  These projects aim for quality improvement.  In the context of education, this refers to the capacity of an organisation to produce valued outcomes reliably for different subgroups of pupils, being educated by different teachers and in varied organisational contexts.  Since improvement research is an iterative process often extending over considerable periods of time, it is also referred to as continuous improvement. (p. 10)

The Six Principles of Improvement Science

Learning to Improve draws heavily upon work undertaken to bring about quality improvement in healthcare, and in particular on how to shape systems to bring about the effective interaction of various professionals within a hospital or health system.  In addition, Learning to Improve draws upon ideas such as communities of practice, teacher action research, lesson study, developmental evaluation and design-based implementation research.  As a result of synthesising thinking and practice within these various fields, Bryk et al (2015) have identified six improvement principles:

1.     Make the work problem-specific and user-centred – Quality improvement starts with a single question: ‘What specifically is the problem we are trying to solve?’ It enlivens a co-development orientation: engage key participants as problem definers and problem solvers from the earliest phases of development through to large-scale implementation.

2.     Focus on variation in performance – A networked improvement community aims to advance efficacy reliably at scale. Identifying and addressing the sources of variability in outcomes is essential.  Rather than simply documenting ‘what works,’ as in estimating an on-average effect, aim to learn ‘what works for whom, and under what set of conditions.’ Develop the know-how to make innovations succeed for different students across varied educational contexts.

3.     See the system that produces the current outcomes – It is hard to improve a system if you do not fully understand how it currently works to produce its results.  Seek to understand better how local conditions shape work processes and resulting outcomes. Use this analysis to explicate a working theory of improvement that can be tested against evidence and is further developed from what is learned as you go.

4.     We cannot improve at scale what we cannot measure – Measure outcomes, key drivers and change ideas so you can continuously test the working theory and learn whether specific changes represent an improvement.  Constantly ask: ‘Are the intended changes actually occurring? Do they link to changes in related drivers and desired system outcomes?’ Anticipate and measure for unintended consequences too.

5.     Use disciplined inquiry to drive improvement – Common inquiry protocols and evidentiary standards guide the diverse efforts of NICs (networked improvement communities).  Engage in systematic tests of changes to learn fast, fail fast and improve fast. Remember that failure is not a problem; not learning from failure is.   Accumulate the practical knowledge that grows out of failure, and build on it systematically over time.

6.     Accelerate learning through networked communities – NICs aim to break down silos of practice and research.  They enliven a belief that we can accomplish more together than even the best of us can accomplish alone.  A shared working theory, common measures, and communication mechanisms anchor collective problem-solving.  Organise as a NIC to innovate, test and spread effective practices sooner and faster. (pp. 172-173)

These principles are captured in Figure 1, The new improvement paradigm (Bryk et al, p. 186).  In this diagram the shaded boxes represent the new paradigm and the unshaded boxes represent where we have moved from.

Figure 1 The new improvement paradigm

What are the implications for school leaders and school research champions?

Bryk et al argue that putting what would appear to be relatively simple ideas into practice is potentially more challenging than expected.  Bryk et al go on to highlight a number of areas where network initiation teams may struggle, and which are likely to cause very similar challenges for school research leads.

Starting well – make sure you take time to understand the school you are operating in.  What are the fundamental causes of the problems the school is facing?  Avoid making assumptions about what the problem is, or identifying solutions before you have diagnosed the problem.  It is also essential to spend time identifying the aim of what you are trying to achieve and the desired outputs and outcomes.

Starting Smaller, Learning Fast, and Aiming for Quality at Scale – make sure you start small with pilot groups (though make sure these groups are typical of both pupils and staff).  Make sure you put in place systematic processes to generate learning from the pilot scheme.  Find partners – be it within the school or outside the school – to address the challenges you face, as Bryk et al state – shared challenges may lead to faster learning.

Failures Are a Treasure: Really?  Things going wrong through mistakes being made are almost inevitable when working in a system as complex as a school.  Birkinshaw and Haas (2016) argue that to make the most of failure it is important that schools go through a three-stage process: first, learn from every failure; second, share the learning from each failure; third, review the pattern of failures.  And to do this, it is necessary to put structures in place which are fast, frequent and forward-looking.

And some final words

The field of Improvement Science potentially provides a different way of thinking about the role of the school research champion, with an emphasis on supporting quick tests of change which are subject to iterative, ongoing development.  Most importantly, there is an unrelenting focus on improving the day-to-day work of teachers in classrooms, in order to bring about improved outcomes for both pupils and colleagues.


BIRKINSHAW, J. & HAAS, M. 2016. Increase your return on failure. Harvard business review, 94, 88-93.

BRYK, A. S., GOMEZ, L. M., GRUNOW, A. & LEMAHIEU, P. G. 2015. Learning to improve: How America's schools can get better at getting better.