Monday 30 March 2015

We need to talk about researchEDthics - School Research Leads and Ethical Research and Evidence-Based School Cultures

This post was first published in March 2015

At the recent researchED Research Leads conference held in Cambridge, the 'ethics' of how school-based and led research should be conducted emerged as a key topic for discussion. In this post I will explore some of the ethical issues associated with the role of the School Research Lead and school-based and led research. In doing so, I hope the post will generate both discussion and debate amongst the burgeoning number of School Research Leads about the ethical challenges of conducting both school-based research and evidence-informed practice. This attempt to promote this much-needed discussion is framed by the following questions.
  • What do we mean by research? 
  • What are the implications for school-based research of conducting research within an ethical framework? 
  • What is the difference between research and evidence-informed practice? 
  • What are the ethical considerations which need to be considered when undertaking evidence-informed practice? 
What is meant by research?

This section draws upon the work of the National Research Ethics Service (NRES) and the National Patient Safety Agency guidelines: Defining Research. As such, research will be defined as the process of creating new generalisable knowledge, which could include both the generation and testing of hypotheses. The NRES publication goes on to state that research may involve some, if not all, of the following activities (amended for use in schools):
  • defined aims and objectives 
  • the testing or development of hypotheses 
  • comparing interventions 
  • collecting quantitative and qualitative data over and above data generally collected within the school - though could include normally collected data 
  • significant change to teaching approaches/support strategies - which are in addition to what is provided for pupils 
  • allocation of pupils to intervention groups 
  • the use of a sampling framework to decide who is within the study 
  • any form of intervention involving some form of randomisation. 

What are the implications for school-based research of conducting research activities within an ethical framework?

  • There would appear to be a lack of understanding of the difference between research, quality improvement and evidence-based practice.
  • Much of the 'so-called' research or evidence-informed practice being undertaken in schools could be classified as quality improvement.
  • However, ethical considerations are just as relevant to quality improvement as to research.
That said, it is unrealistic to expect all changes to teachers' practice to be subject to ethical approval, and I now turn to the difference between research and evidence-informed practice.

What is the difference between research and evidence-informed practice?

In the context of nurse education, Carnwell (2001) identified the following differences between research and evidence-based practice. These differences are illustrated in Table 1, and have been amended for a school context.

Table 1 - The differences between research and evidence-based/informed practice

Amended from Carnwell (2001, p. 58)

As such, the main difference between the two is that evidence-based/informed practice is concerned with the USE of the best current available evidence to inform practice, taking into account the needs and wishes of learners/pupils, whereas research involves an attempt at the PRODUCTION/CREATION of new generalisable knowledge, which can be applied in a range of contexts.

What are the ethical issues associated with evidence-informed practice?

Accepting the distinction between research and evidence-based/informed practice, a number of ethical issues need to be addressed whilst engaging in evidence-informed practice. Again, I will turn to the evidence-based medicine movement for initial insight, and to the four principles which inform medical ethics. In the following paragraphs the four principles of medical ethics are 'mapped' against a school setting.


Justice

The 'justice' principle requires the burdens and benefits of new teaching approaches or innovations to be distributed fairly among all pupils or staff within the school. It also requires evidence-informed practice to uphold existing legislation, and to be fair to all pupils and staff.


Beneficence

This principle requires that the action or intervention being proposed is done with the intent of doing good for the pupil, member of staff or school.


Non-maleficence

This principle requires that an action or intervention does not do harm to the pupil, member of staff involved or other stakeholders. Educators are required to take a wider perspective and consider whether actions may have unintended negative consequences which are subsequently experienced by others.


Respect for autonomy

This principle requires that the pupil or member of staff have autonomy of thought and action when involved in decision-making. Given the nature of teaching and learning, it is difficult to envisage fully informed consent from young children in all circumstances. However, when projects involve other colleagues, seeking their consent is both principled and practical.

What are the implications of these principles for evidence-informed practice?

The implications of the above principles are as follows.
  1. For teachers to ethically engage in evidence-informed practice, they must be explicitly aware of an appropriate ethical framework. 
  2. Teachers should give active consideration to the application of that ethical framework prior to any changes in practice arising from an investigation of the best current available evidence. 
  3. Ethical considerations should be explicitly noted in some form, be it a note on a lesson plan, a reference within a scheme of work, or a record in a reflective diary or log. 
  4. Any processes for the recording of ethical issues need to be both proportionate and fit for purpose. 
To conclude:

In this post I have argued, first, that the leadership and management of ethical issues is an integral part of the School Research Lead's role. Second, if schools are to engage in research - defined as seeking to create new and generalisable knowledge, including the testing or development of hypotheses - a clearly defined ethical approval process must be applied within the school. Third, evidence-informed practice involves making use of the best current available evidence, which distinguishes it from the prior definition of research. Fourth, teachers engaged in evidence-informed practice must be aware of ethical issues and engage in a process of evidenced self-regulation. Fifth, whilst the enthusiasm for teachers to engage with research should be welcomed and developed, it is essential that colleagues use appropriate processes - in order to do the right thing, it is also essential to do things right. Sixth, whilst conceptual and operational clarity is being developed in relation to ethics, research and evidence-informed practice, teachers should focus on developing their skills as critical consumers of research. Finally, I hope this post is helpful in generating debates within schools about the ethical implications of conducting both research and evidence-informed practice.


Carnwell, R. (2001). Essential differences between research and evidence-based practice. Nurse Researcher, 8(2), 55-68.

Monday 23 March 2015

The School Research Lead - Supporting teachers to improve not prove

At the recent Research Leads Conference held in Cambridge, there was a discussion about Stokes's (1997) notion of Pasteur's Quadrant. In particular, there was agreement that the role of the school research lead was to encourage colleagues to engage in Yes/Yes research activity, i.e. use-inspired basic research and the creation of new and generalisable knowledge. In this post I argue that this stance is both philosophically and practically mistaken; rather, the focus should be on No/No activity, which primarily develops individual colleagues' skills as researchers. Furthermore, the Yes/Yes stance could have significant negative unintended consequences for the desired development of evidence-informed cultures within schools. First, let's examine Stokes's model of scientific research.

Stokes's Quadrant Model of Scientific Research

At September 2014's researchED conference, Dylan Wiliam gave a fascinating explanation of why education will never be a research-based profession. In this presentation Wiliam made reference to Stokes's (1997) notion of Pasteur's Quadrant. Stokes developed his quadrant model of scientific research as an attempt to end what he saw as the unhelpful separation between basic and applied research, which he argued made it more difficult to think about the relationship between scientific discovery and improvement. To challenge this dichotomy between 'pure' and 'applied' research, Stokes developed a 2 x 2 table by which to classify research, illustrated in Table 1.

The upper left-hand cell contains research which is driven purely by understanding for its intrinsic value, for example, Bohr's work on atomic structure. The upper right-hand cell captures research which is undertaken for both understanding and use, where the search for understanding is driven by the need to solve some very specific problems, for example, Pasteur's work on microbiology and pasteurisation. The bottom right-hand cell focuses on research which has very specific applied goals, without necessarily seeking an understanding of the basic phenomena which are being 'exploited'. Edison's work on the commercial development of telecommunications and electric lighting sits neatly in this cell. (All examples are cited by Stokes.)

However, Stokes argues that this does not mean that the bottom left-hand cell is empty. This cell may involve exploration of particular phenomena where there is no attempt to understand the issue being studied or even apply the findings. Examples of this can be found where data is being classified and presented without any subsequent discussion about the implications for action of such data. As such, this type of research is often driven by the intrinsic curiosity of the researcher. Furthermore, this cell is not just limited to 'curiosity': research activity in this quadrant may involve the researcher developing their own skills as a researcher, rather than seeking to create new knowledge and innovations.

Implications for School Research Leads and Teacher Researchers

For School Research Leads and teachers there are three main implications of Stokes's model of scientific research.

1. If you accept Flyvbjerg's argument that the social sciences should give up on 'physics envy', then focussing the work of the school research lead on supporting the development of Yes/Yes activity is probably misguided. It is misguided insofar as: one, it is not achievable; two, the vast majority of colleagues do not have the appropriate skill set for such activity; and three, nor do they have the time.

2. If the focus is on applied research, then the tools and techniques of evidence-based practice should form an essential component of the School Research Lead's role. At the core of evidence-based practice is the idea that decisions are informed, not determined, by the best available evidence. The focus of evidence-based practice activity is on bringing about improvements in decision-making, which subsequently benefits pupils and colleagues. Evidence-based practice is not about the creation of 'new' generalisable knowledge which is applicable outside of the current setting. Furthermore, there is an emphasis on accessing evidence from a number of sources, not just 'scientific' academic research. School-generated evidence is essential, alongside professional judgement and the views of stakeholders.

3. Teachers should focus on developing their skills as evidence-informed practitioners, be it by critiquing research or engaging in evidence-based practice, with the School Research Lead being one source of support. Indeed, to go back to the beginning of this blog and Dylan Wiliam's presentation, the challenge is for teachers throughout their careers to improve practice through the process of 'disciplined inquiry'.

In many ways the core message of this blog is as follows: do simple things well; ask good questions; make the most of the current best available evidence; develop your skills incrementally; apply what you find out to improve your practice; and evaluate what you do. In doing so, you will end up having a far more interesting and enjoyable career. More importantly, it will make positive differences to the lives of pupils and colleagues. Remember, you are teachers wishing to improve, rather than researchers seeking to prove.


Flyvbjerg, B. (2001). Making Social Science Matter: Why Social Inquiry Fails and How It Can Succeed Again, Cambridge University Press, Cambridge.

Stokes, D.E. (1997). Pasteur's Quadrant: Basic Science and Technological Innovation, Brookings Institution Press, Washington D.C.

Saturday 14 March 2015

Research Leads Cambridge and What's Love Got to do With It - CUPID and the classification of research activity

On Saturday I attended another great session put on by the researchED dynamic duo of Tom Bennett and Helene Galdin-O’Shea. There were several great speakers, each of whom made presentations full of opportunities for learning. 
  • Philippa Cordingley told us about whole school approaches to using research within individual school, along with evidence about what works.
  • Clare Hood, Abi Thurgood-Buss and Bethan Morgan shared their perspectives on leading research in schools, supported by the SUPER Partnership and Cambridge University.
  • Vincent Lien made a compelling, and entertaining case for teachers to have free access to research journals.
Unfortunately, due to travel arrangements, I was not able to hear what I’m sure would have been stimulating presentations from Caroline Creaby, Jonathan Sharples, Ffion Eaton and Robert Loe.

Fortunately for me, though probably not for the audience, prior to departure I was able to squeeze in my own presentation entitled The School Research Lead and Star Trek: to boldly go where others have gone before, or why the School Research Lead is not Captain Kirk. The rest of this post will focus on one key issue from the presentation: how do you classify research activities? In doing so, I will be aided by the recent BERA-RSA report on Research and the Teaching Profession. I will then proceed to classify different research activities to help School Research Leads develop a meaningful research agenda for their own school. As such, a sensible place to start is BERA-RSA's definition of the term research.

What do we mean by the term 'research'? 

The BERA-RSA inquiry intentionally devised a general and comprehensive definition of research.

By research, the report’s authors mean any deliberate investigation that is carried out with a view to learning more about a particular educational issue. This might take a variety of forms and be concerned with a range of issues, for example: the secondary analysis of published data on school exclusions, interviewing a range of colleagues about examination performance in the English Department, taking part in a national Randomized Control Trial concerned with the teaching of Mathematics, responding to a survey about teachers’ use of the internet to inform curriculum planning, working with a university department of education on a study into teachers’ use of new technology. 

Setting aside issues arising from the inclusive and wide-ranging nature of this definition of research, there is value in classifying the activities encompassed within it. Accordingly, Table 1 classifies different 'research' activities in the following five categories: consume, use, produce, involvement and disseminate.

Table 1 – Classification of Research Activities – The CUPID Model 

Consume – involves reading texts and searching the Internet with a critical focus, and includes activities such as the production of critical synopses of texts or discussions at regular Journal Club meetings. Most importantly, this is a state of mind of seeking out research and subsequently engaging in a meaningful and thoughtful critique.

Use –  is where evidence-based practice comes to the fore – where colleagues use the best available evidence from a variety of sources – academic, school, pupils and other stakeholders alongside professional experience – to make judgements to bring about changes in practice for the benefit of pupils, colleagues, and other stakeholders.

Produce – this could involve undertaking action research projects, supported experiments or enrolling on master's or doctoral degree programmes. Alternatively, the collation and presentation of data is also included in this category. However, the main purpose of ‘producing’ research is not the generation of new knowledge, but rather developing the capacity of colleagues to engage in ‘disciplined inquiry’, a topic I will return to in a future post.

Involvement – this is when a school or individual colleagues engage in a research project, either in conjunction with a higher education institution, by being a site for a randomised controlled trial, or by answering survey requests from various bodies engaged in their own research. This also includes department-, school- or college-wide activities designed to look at institution-wide issues, where some colleagues are participants rather than active researchers.

Disseminate – an essential part of the research process, as evidence, data analysis, discussion and recommendations need to be subject to a process of peer review. This does not mean the ‘research outcomes’ need to be published, but rather that there is an openness and transparency which allows colleagues and others to explore the assumptions underpinning the research. It is necessary to check for confirmation bias, and whether the researcher is over-claiming the reliability and validity of the findings.

All classifications, particularly those based on 2x2 boxes, have their limitations. Indeed, the CUPID model does not address the ‘process’ issues associated with creating a research and evidence-based culture (thanks to Lisa Pettifer @Lisa7Pettifer for drawing this to my attention). With that in mind, a future post will explore the process issues associated with creating a research-based school culture.

Nevertheless, the CUPID model provides a simple mechanism for thinking about the range of activities associated with research. If, by allowing colleagues to identify activities fit for their context and school, the model makes research activity more likely to be undertaken, then CUPID will have more than served its purpose.
Finally, for those of you who wish to access today's presentation, I am sure it will soon be made available on the researchED website.

Next week I will revisit the PICO question format, and the discussions which took place on this topic.

Monday 9 March 2015

The School Research Lead and Developing Critical Analysis

In a recent post I wrote about the work of Wallace and Wray (2011), who produced a range of resources to support both critical reading and writing. In particular, I shared their 5-question structure for creating a critical synopsis of a text, which I suggested could be used as a development activity for a Journal/Book Club. In this post, we will expand upon the work of Wallace and Wray and consider 10 key questions for use in critical analysis. But first let's recap the 5 questions suggested for producing a critical synopsis of a text.

A  Why am I reading this?
B  What are the authors trying to do in writing this?
C  What are the authors saying that is relevant to what I want to find out?
D  How convincing is what the authors are saying?
E  What can I make of this?
Ten Critical Analysis Questions
Wallace and Wray suggest 10 further questions to engage in a much more critical and evaluative analysis of the text.
  1. What review questions am I asking of this text?
  2. What type of literature is this?
  3. What sort of intellectual project is being undertaken?
  4. What is being claimed that is relevant to answering my review questions?
  5. To what extent is there backing for claims?
  6. How adequately does any theoretical orientation support claims?
  7. To what extent does any value stance affect claims?
  8. To what extent are claims supported or challenged by others' work?
  9. To what extent are the claims consistent with my personal experiences?
  10. What is my summary evaluation of the text in relation to my review question? (Wallace and Wray, 2011, p. 109)
The 10 questions can then be mapped against the 5 questions used to create the critical synopsis, and this allows us to generate a far richer and more critical analysis of the text.

A Why am I reading this?
1. What review question am I asking of this literature?

B What are the authors trying to do in writing this? 
2. What type of literature is this?
3. What kind of intellectual project is being undertaken?

C. What are the authors saying that’s relevant to what I want to find out?
4. What is being claimed that is relevant to answering my review question?

D How convincing is what the authors are saying?

5. How far is there backing for claims?
6. How adequate is any conceptual or theoretical orientation to back claims?
7. How far does any value stance adopted affect claims?
8. How far are claims supported or challenged by others' work?
9. How far are claims consistent with my experience?

E In conclusion, what use can I make of this?
10. What is my overall evaluation of this literature in the light of my review question?

As such, the major change is in the level of detailed critical analysis relating to question D – changing from one general evaluative question in the critical synopsis, with little or no indication of what to look for, to five questions you could be checking the text against if you do the full critical analysis. 

Wallace and Wray then go on to identify a range of sub-questions for each of the 10 critical analysis questions. For example, question 1 – 'what review questions am I asking of this text?' – has the following sub-questions:
  • What is my central question?
  • Why select this text?
  • Does the critical analysis fit in with my investigations with a wider focus?
  • What is my constructive purpose? (Wallace and Wray, p. 237)
Furthermore, in order to help colleagues develop their skill levels, Wallace and Wray have produced a template for conducting a critical analysis, which can be found using this link. In addition, they have produced a number of worked examples of critical analyses.

However, what I hope becomes clear from both this post and my previous post is the need to have a well-formulated and answerable question, as this will provide a clear framework and starting point for any critical analysis. Accordingly, these resources provide an extremely useful starting point to support School Research Leads in their task of developing colleagues' skills as critical consumers of research. Indeed, jointly undertaking critical analysis of key texts - say in a Journal Club or through personal reading - will not only build capacity in consuming research but will also make it more likely that formal academic research can be put to good use. This type of work may not be as exciting as conducting school research projects, though it will lay the foundation for much more productive future school evidence-based activity.

In a future post I will be building on both 'critical synopses' and 'critical analyses' of texts to explore what is meant by a Critically Appraised Topic.

Wallace, M. and Wray, A. (2011). Critical Reading and Writing for Postgraduates (2nd edition), Sage, London.

Declaration of interest
Mike Wallace was my doctoral supervisor and remains a close personal friend; he has also contributed to this blog post.

Monday 2 March 2015

The School Research Lead and Educational Prescriptions - a tool to support the development of evidence-based practice

As noted in previous posts, asking well-formulated questions is fundamental to effective evidence-based practice. In supporting the development of school cultures which have the effective use of research and evidence at their core, school research leads need to engage in processes which identify opportunities for generating well-formulated questions, with the answering of such questions supporting the needs of individuals or groups of students. In evidence-based medicine, Straus et al (2011) have identified a number of ways of supporting the asking of good questions, and in this post I will be looking at the use of educational prescriptions in aiding the development of evidence-based practitioners.

What is an Educational Prescription?

One of the challenges in developing evidence-based practice is to ensure that naturally occurring opportunities for learning are taken advantage of. In this instance, a School Research Lead or other member of staff may identify a particular problem or issue to be addressed, but there is a danger that both the question and the answer are lost in the day-to-day business of the school. One way of capturing the 'question' is through the use of an educational prescription, which records the question, who is responsible for developing the answer, and the deadline for the completion of the task. Accordingly, an educational prescription consists of the following:
  • A specification of the learning problem that generated the question.
  • A re-statement of the problem as a well-formulated and answerable question.
  • A clear statement as to who is responsible for answering the question.
  • A deadline for answering the question - bearing in mind the learning needs of students or staff, and taking into account the urgency of the problem that generated it.
  • A clear articulation of the steps involved in answering the question. (Adapted from Straus et al 2011, p. 23.)
Possible Educational Prescription Template

Pupil/Class/Year Group :

Member of staff:

Problem : How would you describe the situation?

Intervention : What are you planning to do with your pupils/class/year group?

Comparison : What is the alternative to the intervention (e.g. a different intervention)?

Outcome : What are the anticipated effects of the intervention?

When will the educational prescription be followed up?

Check-list prior to future discussions:
  • How did you go about searching for relevant evidence?
  • How did you take into account pupil/student views?
  • What were the outcomes of your search strategy?
  • How valid and reliable is the evidence you found?
  • Can this evidence be applied to the current problem?
  • How useful was this process?

Adapted from Straus, S.E., Glasziou, P., Richardson, W. S. & Haynes, B.R. (2011). Evidence-Based Medicine: How to Practice and Teach It (4th edition), Churchill Livingstone, p. 24.

How can Educational Prescriptions be used?

Depending upon where the school is in the development of an evidence and research-informed culture, these educational prescriptions could be used in a number of different settings: be it case reviews of individual students, departmental meetings, or end-of-year reviews and evaluations. Alternatively, they may be used by individual members of staff as a presentation device in order to share with colleagues the essence of the approach.

Tips for using Educational Prescriptions
  • Produce 2 or more copies - one for the School Research Lead and another for the staff undertaking the review.
  • If connected to a local higher education institution, use the prescription to engage in meaningful dialogue with the HEI librarians.
  • Try to use them as part of day-to-day teaching and learning rather than as one-off events.
  • In a Journal Club or other School Research Leads sessions, ask colleagues to write out educational prescriptions to help develop their skills in developing well-formulated questions.
And finally
There are no short-cuts in developing the skills of asking well-formulated questions. It will invariably take time, and both school research leads and colleagues will no doubt make mistakes. However, it is that very process of making mistakes and learning from them which enhances colleagues' skills and expertise as evidence-based practitioners.

Straus, S.E., Glasziou, P., Richardson, W. S. & Haynes, B.R. (2011). Evidence-Based Medicine: How to Practice and Teach It (4th edition), Churchill Livingstone.