Saturday, 25 June 2016

How to really learn from failure (or a relative lack of success)

If you are an evidence-based practitioner, school research lead or senior leader wishing to maximise your learning from failure, then this post is for you.  Given the disappointment surrounding both the Rochdale Research into Practice and the Ashford Research Champion EEF evaluation reports on the effectiveness of research champions in helping transfer research into practice, there is a pressing need for a structured approach to learning from failure (or relative lack of success).  With that in mind, this post will draw upon the work of Birkinshaw and Haas (2016), who have recently outlined a three-step approach by which organisations can maximise their return on failure.
  
Increase your return on failure

Birkinshaw and Haas begin by stating that 'one of the most important and most deeply entrenched reasons why established companies struggle to grow is fear of failure… (with) a risk-averse culture a key obstacle to innovation' (p90).  As such, they propose three steps by which an organisation can raise its return on failure.

First, study individual projects that did not pan out and gather as many insights as possible from them.  Second, crystallise those insights and spread them across the organisation.  Third, do a corporate-level survey to make sure that your overall approach is yielding all the benefits it should.

Let’s now look at these steps in turn.

Step 1 Learn from every failure

The first step involves getting colleagues to look back on interventions and innovations which have not been successful.  Birkinshaw and Haas argue that for many organisations (and individuals) this does not come naturally, with colleagues expressing a preference to look to the future rather than back to the past.  To help people ask the right questions about failed initiatives, Birkinshaw and Haas have developed a worksheet which identifies all the sources of costs and benefits which might come about from a failed project.  I have amended this worksheet so that it sits more easily within a school context.


THE PROJECT REVIEW WORKSHEET

Even when initiatives flop, they can still provide tremendous value to your organisation, if you examine them carefully and capture the critical lessons. This template will help you do that.

Briefly describe a recent failed project or activity you were involved in:

Benefits
  • What have we learned about the needs and preferences of our pupils, their parents and our staff, and about our current school context? Should we change any of our assumptions?
  • What insights have we gained into the future of the school? How should we adjust our school development plan?
  • What have we discovered about the way we work together? How effective are our school processes, structures and culture?
  • How did we grow our skills, individually and as a team? Did the project increase relational trust and goodwill? Were any developmental needs highlighted?

Costs
  • What were the financial costs of the project: staffing, materials and capital?
  • What were the external costs? Did we hurt the school’s reputation in the local area or nationally? Have we weakened our position to attract pupils, staff and funding?
  • What were the internal costs? Did the project damage school or team morale, or consume too much attention at the expense of other projects? Was there any cultural fallout?



Key insights and take-aways



Amended from Birkinshaw and Haas (2016) p92

Step 2 Share the lessons

Birkinshaw and Haas go on to argue that the real organisational (school) benefit from failure comes when the learning from that failure is shared across the organisation.  This requires the organisation (school) to build in a cycle of review, which allows the lessons from failure to feed into existing processes.  Furthermore, having difficult but positive conversations about failure helps to generate relational trust, which in turn creates the conditions in which colleagues may be willing to take on more difficult and challenging projects.  Birkinshaw and Haas argue that organisational leaders need to be brought together on a regular basis to discuss their failures, and they suggest the use of what they call the Triple F process:
  •         Reviews are FAST and to the point
  •        Take place FREQUENTLY, through good times and bad
  •         Are FORWARD looking, with an emphasis on learning (p92) 

Step 3 Review your pattern of failure

This involves taking an overview to see whether the organisation's approach to failure is making the most of the opportunities for learning.  Is the organisation learning from every new innovation it has introduced over the last year?  Is it learning from every unsuccessful intervention or innovation?  Are the lessons from failure being shared across the organisation?

And some final words

For those of us interested in evidence-based practice, transferring research into practice is not going to be a quick and easy task, and it is likely to involve more failures than successes.  However, if we are to make the most of those failures (or relative lack of success), then ensuring that we learn from them may be some of the most important work that we can do.  Having a structured approach to learning from failure may be a necessary, though not sufficient, condition for bringing about the development of an evidence-based profession.

Reference


Birkinshaw, J. and Haas, M. (2016) Increase your return on failure, Harvard Business Review, 94(5), pp. 88-93.

Friday, 17 June 2016

School Research Champions and Habit Formation

Introduction

Given that one of the goals of Continuing Professional Development (CPD) is to bring about beneficial changes in actions and behaviours, and for those changes to be sustained by becoming habits, it seems sensible to consider what the research literature suggests about habit change and habit formation.  In doing so, I will take a cross-disciplinary approach and draw upon what has been written about habit formation and change within a health promotion context.  I will then use this framework to help explain one of the main findings of the recent Rochdale Research into Practice evaluation report and executive summary, published in May 2016: that there was no evidence that teachers were more likely to use research to inform their teaching practice after being involved in the pilot (p4).

A cross-disciplinary framework for habit change

Neal et al (2012) state that habits are defined as actions that are triggered automatically in response to contextual cues that have been associated with their performance.  As such, the regular performance of an action creates a link between the situation (cue) and the action, so that when the cue is encountered or experienced it triggers the action, which is performed automatically: for example, taking the register as students enter a classroom, or, when asking a question of a class, choosing the first student who puts up their hand.

Lally et al (2010) worked with 96 undergraduates and asked them to adopt a new health-related lifestyle behaviour.  Of the 82 participants who completed the study, the average time it took for a behaviour to become automatic was 66 days, although this varied by participant from 18 days to 254 days.  This is much longer than the 20 to 30 days often cited as the time required to form a new habit.

Bringing about habit change

So what if a school research lead or teacher wishes to develop a new behaviour?  Gardner, Lally and Wardle (2012) provide a useful checklist, developed for health promotion, which I have amended for use in an educational context.
  • Decide on a strategy or practice that you would like to embed in your teaching, for which there appears to be robust evidence of effectiveness
  • Choose a simple action that will contribute towards your goal and which you can do on a daily basis within a lesson, tutorial or other context, e.g. questioning and wait-time
  • Plan the lesson or lessons in which you will undertake the chosen action.  Try to be consistent, and look for an action which you can repeat daily.
  • Every time you encounter that lesson, time or place, perform the action you have chosen.
  • Continue the action for at least 66 days, by which time the action should have become automatic.

So what are the implications for the Rochdale Research into Practice project and subsequent interventions?  Before turning to these, I will provide a very brief summary of the project's main findings.

Rochdale Research into Practice Project – A summary 


Aim – To pilot an intervention aimed at supporting teachers to use evidence-based teaching and learning strategies to improve pupil progress.

Objectives – To help teachers to: have more positive views about the use of research for improving teaching and learning; apply educational research findings in the classroom and at a strategic development level; and establish a stronger culture of evidence-based enquiry and practice.

Duration - The project ran for one year (2014/2015).

Who was involved - Ten primary schools in the Rochdale area, all members of the Inspirational Professional Learning Community Network (IPLCN); 280 pupils were taught by participating teachers.

Delivery - The project was delivered by a senior Continuing Professional Development (CPD) consultant based at one of the schools, and involved the following strands:
  • CPD sessions – 3 full days and 4 half-day sessions
  • School visits by the CPD leads
  • On-going implementation
  • Ongoing email and phone advice and guidance by CPD Lead 
  • Collaborative CPD and professional learning conversations 
  • Engagement with senior leadership
Focus - The use of evidence-based teaching and learning strategies such as metacognition, self-regulation and feedback.


Funding - The project was funded through the Education Endowment Foundation (EEF) Research Use in Schools grants round. It was jointly funded by the EEF, the Department for Education, and the Mayor’s London Schools Excellence Fund.

Findings

There were some positive changes in teachers’ attitudes towards research during the course of the pilot.

There was no evidence that teachers were more likely to use research to inform their teaching practice after being involved in the pilot.

The project was very well received by teachers, suggesting that this model may be a promising way of engaging teachers in evidence-based practice.

Finding time for working collaboratively on implementing research evidence in practice was considered a challenge, but overall the requirements of the programme were feasible.

Before a trial is considered, further thought should be given as to which elements of the project are essential for its efficacy, and whether a trial should test the project structure as a model for research dissemination or both the structure and content of the project as piloted (p4).

Discussion 

So given both the apparent lack of change in teachers' use of research to inform their practice, and the question marks over both the structure and content of the pilot, what can we learn from the work of Gardner, Lally and Wardle to inform the content and structure of future interventions?
  • If research is to be used to bring about sustained changes in practice, then such changes may need to be relatively small and easily repeatable on a regular, if not daily, basis. 
  • To bring about such habit change, CPD programmes need to be designed to ensure there is almost real-time monitoring and support. CPD programmes which rely on colleagues getting together on a relatively infrequent basis, even half-termly, are unlikely to bring about habit change.
  • Sustained habit change, even when it involves small, bite-sized changes in the use of research to inform practice, is unlikely to be brought about quickly and will require sustained practice and support. 
  • Schools and leaders need to be conscious of situational and contextual cues, and work to create an environment where those cues promote positive habits and the use of research and evidence, while reducing the number of cues and contexts which facilitate less productive habits and behaviours. To borrow a phrase from Margaret Mulholland of Swiss Cottage School, research and evidence need to be baked into everything a school does, rather than simply bolted on as an extra requirement. 

And some final words

Although both the Rochdale Research into Practice and Ashford Research Champions projects appear to have been less than successful in transferring research into practice, we should not be too disappointed.  Evidence of what appears not to have worked is just as important as evidence of what works.  As Christensen and Raynor (2003) note, positive research outcomes are very rarely the final word: progress comes when researchers refine a theory to explain situations in which the theory previously failed.  As such, those of us interested in developing theories of action associated with evidence-based practice should be grateful to the colleagues who participated in the Rochdale and Ashford projects, as their work will hopefully contribute to the development of new hypotheses and theories of action.  With this in mind, my next post will explain a three-step process whereby schools can maximise their learning from a relative lack of success or failure.


References

Christensen, C.M. and Raynor, M.E. (2003) Why hard-nosed executives should care about management theory, Harvard Business Review, 81(9), pp. 66-75.

Gardner, B., Lally, P. and Wardle, J. (2012) Making health habitual: the psychology of 'habit-formation' and general practice, British Journal of General Practice, 62(605), pp. 664-666.


Griggs, J., Speight, S. and Cartagena Farias, J. (2016) Ashford Teaching Alliance Research Champion: Evaluation report and executive summary, May 2016, Education Endowment Foundation. Accessed 8 June 2016.

Lally, P., van Jaarsveld, C.H.M., Potts, H.W.W. and Wardle, J. (2010) How are habits formed: modelling habit formation in the real world, European Journal of Social Psychology, 40(6), pp. 998-1009.

Neal, D.T., Wood, W., Labrecque, J.S. and Lally, P. (2012) How do habits guide behavior? Perceived and actual triggers of habits in daily life, Journal of Experimental Social Psychology, 48(2), pp. 492-498.

Speight, S., Callanan, M., Griggs, J. and Cartagena Farias, J. (2016) Rochdale Research into Practice: Evaluation report and executive summary, May 2016, Education Endowment Foundation.


Saturday, 11 June 2016

Transferring research into practice - time and commitment are necessary but not sufficient conditions


If you work in a school and are interested in how research is transferred into practice, then this post is for you.  In particular, it will interest school research leads, heads of CPD/INSET and school leaders who are trying to find ways of developing teaching expertise by supporting the use of educational research in decision-making and teacher practice.  As such, this post will examine the recent Ashford Teaching Alliance Research Champion EEF evaluation report and executive summary, published in May 2016.  I will then use an analytical framework developed by Kegan and Lahey (2009) to attempt to explain why one of the report's key findings, i.e. that there was no evidence that teachers' attitudes towards research, or their use of research evidence in teaching practice, changed during the intervention, should not be particularly surprising.  Finally, I will go on to examine the implications of the key learnings arising from the project for those wishing to lead the use of research and evidence within schools.

Overview of the Ashford Teaching Alliance Research Champion Pilot Project

The aim of the Ashford Teaching Alliance (ATA) Research Champion project was to pilot an intervention aimed at developing teaching expertise and practice by promoting the use of educational research in decision-making and teacher practice. The underlying objective of the intervention was to consider whether, and to what extent, research communication and engagement strategies had the potential to improve teachers’ use of, and attitudes towards, academic research to support pupils’ progress. The intervention ran for the 2014-15 academic year within five schools of the ATA. The intervention had four key elements: ‘audits’ of needs and research interests for individual schools; a series of research symposia for teachers; termly research and development ‘twilight forums’ (events held at the end of the school day at one of the participating schools); and bespoke research brokerage. Finally, delivery of the intervention was led by a ‘Research Champion’, a senior teacher based at one of the schools who worked with research leads, other teachers, and senior leaders to promote engagement with research evidence.

The main findings
  • There was no evidence that teachers’ attitudes towards research, or their use of research evidence in teaching practice, changed during the intervention.
  • Teachers found the research symposia and twilight events valuable, particularly as opportunities to learn about developments in educational research and reflect on teaching practice outside the classroom.
  • Attendance and engagement in the programme was occasionally low due to time pressures faced by teachers. This posed a serious threat to the feasibility of the programme.
  • A greater commitment from senior leadership teams to fully support staff with release time and classroom cover is likely to be necessary for successful implementation.
  • The programme requires further development before it is ready for a trial. In particular, it requires a clearer specification of the key features of the programme in terms of structure, content, and which components are required (p4).

Kegan and Lahey and the Immunity to Change

Kegan and Lahey refer to a study which showed that when doctors tell heart patients that they will die if they do not change their habits, as few as one in seven is able to change their lifestyle successfully.  Even when the stakes are a matter of life and death, the pull of the status quo and of existing habits and practices can be incredibly strong.  Given that CPD is usually about bringing about much lower-stakes changes in habits and behaviours, we should not be surprised if CPD is more often than not ineffective in bringing about sustained and meaningful change.  Nor should we be surprised at the finding of the Ashford Research Champion project that there was no evidence that teachers’ attitudes towards research, or their use of research evidence in teaching practice, changed during the intervention (p4).  Indeed, we should have been extremely surprised had it been otherwise.

Kegan and Lahey argue that both individual and collective beliefs can provide a powerful immunity to change. This mechanism is illustrated in the following table, amended from Kegan and Lahey, which uses as an example the integration of research evidence into day-to-day teaching practice.

Commitment: We are publicly committed to integrating research evidence into our teaching practice.
Doing/not doing instead: We continue not to integrate research evidence into our teaching practice.
Hidden competing commitment: We value existing teaching practices over those which might be suggested by research evidence.
Big assumption: We assume that if we integrate research evidence into practice, it will reveal our lack of understanding of research, or that our current practice is ineffective.

Alternatively, the 'immunity to change' process could work in the following manner, which shows how a school's professional learning community may not support the 'research champions' who wish to turn research into practice.


Commitment: We are publicly committed to integrating research evidence into our teaching practice.
Doing/not doing instead: We continue not to integrate research evidence into our teaching practice.
Hidden competing commitment: We value our place within the school's professional learning community and wish to act in a manner consistent with its culture and values.
Big assumption: We assume that if we publicly integrate research evidence into practice, this may leave us professionally isolated within the school community, as other colleagues do not value the use of research evidence.



Kegan and Lahey's model of how an individual or group may think gives a profound insight into why many seemingly well-supported CPD initiatives fail: there may be an individual commitment to the change, but other things are more important and overpower that commitment.  The apparent power of this model to explain the immunity to effective CPD, and to the change that should result from it, has a number of implications for the key learnings reported as emerging from the Ashford Research Champion project.  So let's examine the proposed key learnings in more detail.

Implications of Kegan and Lahey model for the key learning emerging from the Ashford Research Champion project

The project evaluators identified a number of key learnings which developers should take into account when designing future interventions, and these include:
  • Explore ways to ensure participating staff are given regular, dedicated time for the programme—in particular, release time to attend all events and to engage with the brokerage service, and time to plan, implement, and review changes in classroom practice;
  • Foster support from senior leaders at the school—encouraging buy-in from senior leadership teams would lead to more support for staff, including release time and classroom cover, as well as greater likelihood that learning from the project would be shared and taken forward across the whole school;
  • Allow flexibility for schools to tailor strategies to their own context—this was viewed as key to promoting engagement and buy-in from teachers and senior leadership teams; and
  • Provide practical examples and materials that could be used to facilitate classroom implementation, with a focus on simple strategies expected to bring ‘quick gains’. (p5)

However, the Kegan and Lahey model would suggest that all of the above are what could be described as ‘necessary but not sufficient conditions’ for success.  The provision of time, support, flexibility and practical examples of research into practice will not in themselves be enough to bring about changes in either attitudes towards research evidence or teaching practice.  For want of a better phrase, time, support, flexibility and applied examples are what could be described as ‘hygiene factors’. 

So what does this mean for those wishing to lead the use of research evidence in schools? Well, for me there seem to be three implications.  First, it would be worthwhile for senior school leaders to explicitly work through their ‘immunity to change’ by articulating their public commitments; stating what they are doing or not doing instead; identifying their competing commitments; and finally, highlighting any hidden assumptions they hold about the use of research and evidence within schools. Doing so may identify those factors which may or may not get in the way of success.  

Second, in doing so, it will be necessary to think through the current conceptions held about the relationship between research, evidence and teaching practice, and what these imply (see The Role of Research Evidence in Educational Improvement).  

Third, it is not enough to think differently: success comes from taking specific steps which are inconsistent with our immunity to change and which, in doing so, challenge our thinking.  As such, the pace of change is likely to be erratic; this is messy stuff, and individuals need safe places in which to try out behaviours which are inconsistent with their belief systems.

References

Kegan, R. and Lahey, L.L. (2009) Immunity to Change: How to Overcome It and Unlock the Potential in Yourself and Your Organization, Harvard Business Press, Boston.

Griggs, J., Speight, S. and Cartagena Farias, J. (2016) Ashford Teaching Alliance Research Champion: Evaluation report and executive summary, May 2016, Education Endowment Foundation. Accessed 8 June 2016.