Monday, 29 June 2015

The School Research Lead - How well are you doing in helping others become better evidence-informed practitioners?

As the end of the academic year approaches, you may be thinking about your own teaching of evidence-informed practice. In other words, how well have you been doing in supporting your colleagues to become better evidence-informed practitioners? Adapting the work of Straus et al (2011), and using my own definition of evidence-informed practice, I will list some of the ways in which you can evaluate yourself as a teacher of evidence-informed practice and then consider the implications for the work of the school research lead.

A self-evaluation of your teaching of evidence-informed practice
  1. Am I challenging myself to constantly develop my own skills as an evidence-informed practitioner?
  2. Am I constantly seeking feedback - to challenge my understanding of my performance in teaching evidence-informed practice?
  3. Am I helping colleagues - both inexperienced and experienced - ask well-formulated and focused questions?
  4. Am I incorporating the asking of well-formulated questions into the day-to-day work of the school, for example, in department or school meetings or in 1-2-1s with colleagues?
  5. Am I teaching and modelling search skills in seeking out the best available current evidence, be it academic research or internal school data?
  6. Am I teaching and modelling critical appraisal skills?
  7. Am I teaching and modelling the creation of best-evidence summaries and appraisals?
  8. Am I teaching and modelling the integration of best evidence within my teaching or managerial practice?
  9. Am I teaching and modelling the integration of pupils' and other stakeholders' preferences within my own teaching and managerial practice?
  10. Am I developing new ways of evaluating the effectiveness of my teaching of evidence-informed practice?
  11. Am I working - through the use of evidence-informed practice - to create mutually supportive yet challenging relationships with colleagues within the school?
  12. Am I working to create a mutually supportive yet self-critical network of colleagues in both schools and other settings?
(Adapted from Straus et al, 2011, p. 213)

Implications/suggestions for School Research Leads

The implications of the above will depend on where the individual School Research Lead is in developing his or her own skills as an evidence-informed practitioner, and on the maturity of the school in supporting an evidence-informed culture. Nevertheless, it is possible to identify several implications/suggestions worthy of further consideration.
  1. If you have responded to the above questions predominantly with YES answers, then great. Think about what is working and try to do more of it; if something is not working, stop doing it and do something else instead.
  2. Avoid being intimidated by the above list of questions; it is unlikely that you will have answered YES to a majority of them. Focus on what you can control and what is within your domain of influence. Start with yourself by posing well-formulated questions.
  3. If you answered NO to any of the questions, then ask yourself: why did I answer NO? What's going on? Are there things I could stop doing which are getting in the way of my teaching of evidence-informed practice?
Becoming expert as both an evidence-informed practitioner and a teacher of evidence-informed practice is not something that happens overnight. It will take time, effort and patience. It is not a quick fix for a current problem, but rather an approach which needs to be embedded in day-to-day practice. Teaching is potentially a forty-year career, and the evidence of what works for our pupils is likely to change over that period. Keeping up with the evidence, and helping colleagues do the same, is less an option than a necessity.


References
Straus, S. E., Glasziou, P., Richardson, W. S. & Haynes, R. B. (2011) Evidence-Based Medicine: How to Practice and Teach It (4th edition), Churchill Livingstone.


Sunday, 21 June 2015

School Research Leads - How am I doing as an evidence-informed practitioner?

I've often wondered what goes into making an effective evidence-informed teacher. I wish I could tell you the magic ingredients for making sure your practice is evidence-informed.

My trouble begins when I start thinking about what an evidence-informed practitioner actually does. Is it reading research articles? Is it attending conferences with notable speakers? Is it about working with colleagues to try out new ideas? Is it about enrolling on some form of postgraduate programme of study? Is it about undertaking small-scale action research projects?

Judging from the comments I've received via Twitter over the last 12 months, it would seem to be about all of these practices, and then some. So as we approach the end of the academic year, how can you go about evaluating - in a non-judgemental manner - your performance as an evidence-informed practitioner? One approach is to return to the literature on evidence-based medicine and see what evaluative tools are available for adaptation for use in a school setting. Fortunately, Straus et al (2011) provide a range of questions which can be used for that purpose. But first, let's revisit the five steps which make up the practice of evidence-based medicine (revised here for education).

Step 1 - converting the need for information into an answerable question.

Step 2 - tracking down the best evidence with which to answer that question.

Step 3 - critically appraising the evidence for its validity, impact and applicability.

Step 4 - integrating our critical appraisal with our teaching expertise and taking into account our pupils' values, circumstances and preferences - alongside the views of other important stakeholders.

Step 5 - evaluating our effectiveness and efficiency in executing steps 1 to 4 and seeking ways to improve next time (Adapted from Straus et al, 2011, p. 3).

So let's now see how these steps can be used to help create a range of self-evaluative questions.

Self-evaluation 

Self-evaluation of converting the need for information into an answerable question
  1. Am I asking questions about my teaching and/or management practice?
  2. Am I using either the PICO or CIMO technique to ask focused and answerable questions? (See the illustrative example after this list.)
  3. How do I go about systematically identifying the gaps in both my knowledge and skills?
  4. If stuck, how do I go about getting 'unstuck' when asking questions?
  5. How do I go about saving my questions for future investigation?
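To make this concrete, here is an illustrative example of my own (not one drawn from Straus et al). A PICO question - Population, Intervention, Comparison, Outcome - might read: for Year 9 pupils with weak reading comprehension (Population), does reciprocal teaching (Intervention), compared with our current guided-reading approach (Comparison), improve end-of-term comprehension scores (Outcome)? The CIMO format - Context, Intervention, Mechanism, Outcome - works in a similar way for management questions.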
Self-evaluation of tracking down the best evidence with which to answer that question.
  1. Am I searching at all for 'research evidence' relevant to my teaching/managerial practice?
  2. Am I searching for evidence from a range of sources - school data, pupils and other stakeholders?
  3. Do I know the best sources of evidence relevant to my teaching/managerial practice?
  4. Do I have easy access to this evidence?
  5. Am I becoming more efficient in searching for evidence?
  6. Am I making good use of Google Scholar or ERIC?
  7. How do my searches for evidence compare with those of school librarians, school research leads and senior colleagues who are passionate about using research evidence to inform their practice?
Self-evaluation of critically appraising the evidence for its validity, impact and applicability.
  1. Am I critically appraising evidence - be it research evidence or internal school data?
  2. Am I becoming more effective in applying 'critical appraisal techniques' (for example, Wallace and Wray, 2011) when reviewing the research evidence?
  3. Am I becoming more knowledgeable about some of the key terms used in educational research, for example, effect sizes and confidence limits? (See the worked example after this list.)
  4. Am I creating summaries of critically appraised topics?
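As a quick worked example (again my own illustration, not one from Straus et al): an effect size such as Cohen's d is simply the difference between two group means divided by the pooled standard deviation. So if an intervention group averages 70 marks, a comparison group averages 64, and the pooled standard deviation is 15, then d = (70 - 64) / 15 = 0.4 - that is, the average pupil in the intervention group scored 0.4 standard deviations above the average pupil in the comparison group. Confidence limits then tell us how precisely that 0.4 has been estimated.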
Self-evaluation of integrating our critical appraisal with our teaching expertise and taking into account our pupils' values, circumstances and preferences - alongside the views of other important stakeholders.
  1. Am I integrating critical appraisals into my teaching/managerial practice?
  2. Am I adapting the findings of my critical appraisals to meet the needs of my pupils/colleagues?
  3. Can I explain (and resolve) differences between 'what the evidence says' and my teaching and management practice?
Self-evaluation of changing teaching or management practice
  1. When the evidence suggests a change in teaching/management practice, am I identifying 'inhibitors and nourishers' for this change?
  2. Have I identified a clear and deliverable action plan to implement this change?
  3. Have I evaluated the impact on pupil learning/outcomes of any changes in practice?
  4. Am I considering the sustainability of any changes I am putting into practice?  
(Adapted from Straus et al, 2011, pp. 206-209).
Additional questions - ones more relevant to your context and to the development of an evidence-informed culture within your school - could be devised and added to this list. However, it is important to acknowledge that these questions are part of a process of becoming an evidence-informed practitioner. If this is your first year of seeking to integrate evidence into your practice, it may be that becoming better at devising well-formulated and answerable questions is where your self-evaluation should both begin and end. Remember, no-one is perfect or will be doing all of the above; what matters is that you are engaging in a meaningful process to improve your practice.


References
Straus, S. E., Glasziou, P., Richardson, W. S. & Haynes, R. B. (2011) Evidence-Based Medicine: How to Practice and Teach It (4th edition), Churchill Livingstone.

Wallace, M. and Wray, A. (2011) Critical Reading and Writing for Postgraduates (2nd edition), Sage, London






Sunday, 14 June 2015

The School Research Lead and 'Disciplined Inquiry'

At researchED 2014, Dylan Wiliam said: 'All teachers should be seeking to improve their practice through a process of disciplined inquiry.' Using the work of Cronbach and Suppes (1969)* for guidance, I will seek to provide School Research Leads and teachers with:
  • a definition of the term 'disciplined inquiry'; 
  • an explanation of the difference between 'decision-oriented' and 'conclusion-oriented' inquiry; 
  • a consideration of the implications of the above for School Research Leads in supporting evidence-informed teacher inquirers.
Disciplined Inquiry Defined

Cronbach and Suppes deliberately avoid providing a narrow definition of the term disciplined inquiry. They do so to prevent disciplined inquiry being conflated with quantitative research, whilst at the same time wishing to embrace the potential of qualitative research. By doing so, inquiry is not limited to scientific inquiry but can be expanded to include a broader range of inquiry methods. As such, Cronbach and Suppes state that the purpose of a disciplined inquiry is to set '... out to answer a rather narrowly defined question. The specifics of such inquiries are usually less important than the conceptualisations they generate' (p. 14).

However, the narrowness of focus does not mean a restrictive approach to methods.  Cronbach and Suppes state:

Disciplined inquiry does not necessarily follow well established, formal procedures. Some of the most excellent inquiry is free-ranging and speculative in its initial stages, trying what might seem to be a bizarre combination of ideas and procedures, or restlessly casting about for ideas (p. 16).

Nevertheless, Cronbach and Suppes go on to identify the elements of disciplined inquiry which distinguish it from less disciplined forms of inquiry, stating:

Disciplined inquiry has a quality that distinguishes it from other sources of opinion and belief. The disciplined inquiry is conducted and reported in such a way that the argument can be painstakingly examined. The report does not depend for its appeal on the eloquence of the writer or on any surface plausibility (p. 15).


Accordingly, the following would appear to be the essential components/features of disciplined inquiry:
  • Ultimately, a narrow focus and question for inquiry
  • The possible use of a wide range of inquiry methods
  • Focused self-review to identify potential weaknesses in the inquiry and mitigating actions
  • Internal consistency
  • Conduct and construction in such a way as to allow external scrutiny

So how do we categorise different forms of inquiry? It is to that question I now turn.


Distinguishing between decision-oriented and conclusion-oriented inquiry

Cronbach and Suppes argue that it is necessary to categorise different types of inquiry, primarily to identify both the different focuses of inquiry and the conditions in which they take place. In doing so, they recognise that any categorisation may be arbitrary and that certain inquiries may straddle the boundaries between the categories.

Decision-oriented inquiry 

In this form of study/inquiry, Cronbach and Suppes state that '... the investigator is asked to provide information wanted by the decision-makers: a school administrator, a government policy-maker, the manager of a project to develop a new biology textbook, or the like. The decision-oriented study is a commissioned study. The decision-maker believes he needs information to guide his actions and he poses the questions to the investigator' (p. 20).

Conclusion-oriented inquiry

Cronbach and Suppes state that conclusion-oriented inquiry '... takes its direction from the investigator's commitments and hunches. The educational decision-makers can, at most, arouse the investigator's interest in a problem. The latter formulates his own question, usually a general one rather than a question about a particular institution. The aim is to conceptualise and understand the chosen phenomenon; a particular finding is only a means to an end' (p. 21).

If one sets aside who commissions the inquiry, the distinction between decision-oriented and conclusion-oriented inquiry is potentially useful, as it helps separate evidence-informed practice from broader conceptions of research. The former involves undertaking an inquiry using the best available current evidence in order to help make a decision which will hopefully benefit pupils/colleagues. Conclusion-oriented inquiry, on the other hand, is more in tune with the notion of research for understanding.


Implications for School Research Leads seeking to support evidence-informed teacher inquiry

Initial reflection would suggest there are a number of ramifications for the work of the School Research Lead. 

Focus

Given that disciplined inquiry ultimately has a narrow focus and question, developing the skills to help colleagues devise well-formulated and answerable questions will be an essential component of the School Research Lead's role. This is particularly the case if the purpose of the inquiry is to lead to decisions about practice. Accordingly, it will be helpful for research leads to be comfortable with the difference between background and foreground questions, and with how both the PICO and CIMO formats may assist in this process. A clear and answerable question will hopefully create the conditions whereby inquiry can be seen as both a legitimate and feasible component of colleagues' work.
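To illustrate the background/foreground distinction (my own example, not one from the sources cited here): a background question is a general knowledge question, such as 'What is retrieval practice?', whereas a foreground question is specific and answerable, such as 'For my Year 11 class, does weekly retrieval quizzing, compared with re-reading notes, improve recall in mock examinations?' The latter lends itself to the PICO format and to disciplined inquiry.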

Challenge

A key aspect of the School Research Lead's role will be to develop their own inquiry skills in order to help colleagues unpack the elements of their own inquiries. A previous post on 'humble inquiry' looks at this in more detail, but the key aspect of this work will involve the school research lead patiently seeking to understand a colleague's perspective on an issue/task requiring some form of decision. Helping a colleague get ready to understand the value of disciplined inquiry may take some considerable time.

Trust and culture

A necessary but not sufficient condition for disciplined inquiry is an open and trusting school culture, which creates the conditions whereby colleagues feel comfortable in sharing evidence, ideas and interpretations of data. Disciplined evidence-informed teacher inquiry is unlikely to happen where colleagues do not feel able to share evidence, ideas and thinking in a safe and supportive environment. The School Research Lead has a vital role in creating and maintaining such a culture.

Some final comments
Engaging in disciplined inquiry should be seen as an essential part of the work of all teachers as they seek to improve their practice. As Dylan Wiliam argued at researchED 2014, this could involve: sharing work with colleagues; writing up their work for wider publication; studying for post-graduate degrees; or undertaking research. But for me, no matter in what way disciplined inquiry is pursued, it needs to be focused, transparent, self-critical, internally consistent and capable of withstanding rigorous external scrutiny.

References

Cronbach, L. J., & Suppes, P. (Eds.). (1969). Research for tomorrow's schools: Disciplined inquiry for education. New York: Macmillan. This is a report of a special committee of the National Academy of Education. It includes a detailed discussion of disciplined inquiry, a number of historical case studies of educational research programs and a set of policy recommendations.

* Cited in Shulman, L. (1997) Disciplines of Inquiry in Education: An Overview, in Jaeger, R. (ed.) (1997) Complementary Methods for Research in Education, American Educational Research Association, pp. 3-19.



Sunday, 7 June 2015

Joint Practice Development - How to avoid wasting the hours while keeping the 'minutes'

Teaching School Alliances are encouraged - as part of the R&D strand of the Big 6 - to take part in Joint Practice Development (JPD). Unfortunately, this may lead to teachers and senior leaders attending a series of well-intentioned but ultimately ineffective meetings. As senior leaders seek to make poorly designed collaborative activities work, they end up wasting the hours but keeping the '(meeting) minutes'. To help senior school leaders and teachers make the most of the opportunities that arise from collaborative activity, this post will examine:
  • What we mean by Joint Practice Development
  • Collaboration traps and how collaboration goes wrong
  • Disciplined collaboration and possible solutions to those collaborative traps
  • High performance arising from decentralisation and collaboration
What is Joint Practice Development?

In a review of the evidence on effective approaches to JPD, Sebba et al (2012) use Fielding et al's (2005) definition of JPD, i.e. '... the process of learning new ways of working through mutual engagement that opens up and shares practices with others'. Fielding et al argue that because this involves the development of new ways of working and exchanges of practice, it is distinguished from 'vanilla' knowledge transfer - where knowledge exchange/transfer may or may not lead to subsequent use. Sebba et al go on to identify ten processes associated with effective JPD.
  • Clearly articulated aims and improvement priorities
  • Developing trust
  • Building on existing networks and relationships
  • Developing effective networks
  • Recognition of respective roles and contributions
  • Multilevel (distributed) and multisite leadership
  • Challenge and support
  • Knowledge that meets local needs
  • Student participation in decision-making and governance
  • Addressing competing priorities
Sebba et al provide a useful 'road-map' of what actions to take to create the conditions for effective JPD. However, being successful isn't always about what you have done; it is often about the decisions which led you not to do something. The rest of this post will look at the things to avoid and the strategies that can be adopted to promote successful collaboration/JPD.

Collaboration traps

Hansen* (2009) argues that organisation (school) leaders often sabotage their own efforts by promoting more collaboration than is actually necessary, falling into one or more of the following traps.
  • Collaborating in Hostile Territory - some clusters of schools are just not set up to collaborate, and JPD projects soon 'bite the dust'. This is not surprising given that the current environment in which schools operate fosters competition and independence.
  • Overcollaborating - sharing good practice always provides a good excuse for a new meeting or the establishment of a new network. However, there is the danger that colleagues attend meeting after meeting, with these meetings having little impact on pupil outcomes or professional learning.
  • Overshooting the potential value - it's easy to believe that there are huge benefits to co-operating across a range of schools, yet the challenge is to identify those areas where those benefits can be delivered. There may be potential for synergy, but that synergy may not be of crucial importance to either pupil outcomes or colleagues' learning.
  • Underestimating the costs - collaborative projects may deliver outcomes of merit, though that does not mean the outcome has sufficient worth.  Any desired changes in outcomes must take into account the costs associated with bringing about those benefits.
  • Misdiagnosing the problem - sometimes leaders misdiagnose the reasons why people don't collaborate or work together. It might be assumed that people don't collaborate because of the time commitment or the level of knowledge required. On the other hand, it may simply be that they don't want to collaborate.
  • Implementing the wrong solution - there is a danger that, to facilitate collaboration, specific IT solutions are put in place to 'capture and share knowledge'. This may work if colleagues know what they are looking for, but if they don't want to collaborate, different solutions will be required.
Disciplined collaboration

Given the challenges which need to be overcome to achieve effective collaboration, Hansen goes on to define 'disciplined collaboration' as: 'the leadership practice of properly assessing when to collaborate (and when not to) and instilling in people both the willingness and ability to collaborate when required' (p. 15).

Hansen argues that to achieve disciplined collaboration, leaders (headteachers, school research leads, etc.) need to take three steps.

Step 1 : Evaluate the opportunities for collaboration by asking:
  • Is there a substantial and significant upside to working together?
  • Will the collaboration/JPD lead to better results for our pupils, staff and school?
In other words, collaboration/JPD is a means to achieve particular outcomes; it is not an end in itself.

Step 2 : Spot the barriers to collaboration - what are the barriers to people collaborating well?

Hansen identifies four key barriers to collaboration and JPD.
  • The not-invented-here barrier (school staff are unwilling to reach out to others)
  • The hoarding barrier (people are unwilling to provide help)
  • The search barrier (people are not able to find what they are looking for)
  • The transfer barrier (people are not able to work with people they don't know well)
Step 3 : Tailor solutions to tear down the barriers

Having analysed the problem, it is then possible to begin to identify solutions to each of the barriers. Hansen argues that in tailoring the solutions, three broad levers are available:
  • The Unification Lever - crafting compelling goals, which both value and promote collaborative work.
  • The People Lever - getting the right people to collaborate on the right JPD opportunities, which will require the development of what Hansen describes as T-shaped leadership and management. School leaders will need to develop the skills to lead and support JPD opportunities both within a school and between schools, and recruitment and reward systems will need to emphasise the importance of collaborative work.
  • The Network Lever - facilitating networks within, between and outside of schools which provide access to knowledge and skills that can help deliver meaningful results for both pupils and staff.
Disciplined collaboration : High performance from decentralisation and collaboration

The work of both Sebba and Hansen indicates that effective JPD/collaboration requires distributed leadership, rather than a top-down model of hierarchical control. Hansen has devised a useful framework to help headteachers and school leaders locate their JPD/collaborative activity across two axes: performance from collaboration, and performance from decentralised work (Figure 1).

[Figure 1: Disciplined collaboration - high performance from decentralisation and collaboration.]
Conclusion

Effective collaboration between schools requires high-level leadership and management skills. It is not something that is going to happen just because it's deemed to be intrinsically desirable. A clear sense of purpose needs to be created, with clearly definable outputs and outcomes. Staff involved will need the skills to operate effectively horizontally across schools, whilst still being able to deliver the 'day job'. Finally, many of the solutions to the problems emerging from collaboration may be found in networks. However, it is not always the 'strong' networks which provide the solution; rather, it is often the loosely-coupled networks outside of the participating schools.

Note
* although written primarily with internal collaboration in mind, the model is equally applicable to collaboration across schools and other organisations.

References

Hansen, M. (2009) Collaboration: How Leaders Avoid the Traps, Create Unity, and Reap Big Results, Harvard Business Press.

Sebba, J., Kent, P. & Tregenza, J. (2012) Joint Practice Development (JPD) in Schools and Academies: What Does the Evidence Suggest Are Effective Approaches?, University of Sussex, School of Education and Social Work.

Fielding, M., Sebba, J. & Carnie, F. (2007) Portsmouth Learning Community, Falmer: University of Sussex.