Acknowledging the influence and impact of cognitive biases on evidence-informed practice is essential for the School Research Lead. Why? Well, evidence-informed practice is built on the notion of teachers using their judgement to make decisions which directly or indirectly benefit their pupils and colleagues. If those decisions result from systematic errors in judgement due to cognitive limitations, motivational factors and the environment, they are less likely to bring about favourable outcomes. In this post we will examine the work of Daniel Kahneman and others to help us gain a better understanding of cognitive biases. Finally, we will examine what strategies can be adopted to mitigate the impact of cognitive biases, whilst at the same time developing our expertise as evidence-informed practitioners.
What do we mean by cognitive biases?
Amos Tversky and Daniel Kahneman introduced the term 'cognitive bias' to describe systematic errors in judgement and decision-making which can be due to cognitive limitations, motivational factors and the environment. Recently, Geoff Petty posted a blog which looked at the impact of confirmation bias on evidence-based practice. However, there is a wide range of cognitive biases, other than confirmation bias, which can impact upon evidence-informed practice; these are summarised by Wilke and Mata (2012).
Let's first list the main biases Wilke and Mata identify, before looking at one of them - the Halo Effect - in more detail.
- Confirmation bias - the tendency to selectively search or interpret information in a way that confirms your perceptions or hypotheses.
- Conjunction fallacy - the tendency to assume that specific conditions are more probable than a single general one.
- Endowment effect - the tendency for people to place more value on an object they already own than they would pay to acquire it.
- Fundamental attribution error - the tendency to over-emphasise personal factors and under-estimate situational factors when explaining other people's behaviour.
- Gambler's fallacy - the tendency to think that future probabilities are changed by past events.
- Halo effect - the tendency for a person's positive or negative traits in one area of their personality to colour others' perceptions of them in other areas.
- Hindsight bias - a memory distortion whereby, given feedback about the outcome of an event, people's recalled judgements of the likelihood of that event are typically closer to the actual outcome than their original judgements were.
- Illusory correlations - the tendency to identify a correlation between a certain type of action and effect when no such correlation exists.
- In-group bias - the tendency for people to give preferential treatment to others they perceive to be members of their own group.
- Mere exposure effect - the tendency by which people develop a preference for things merely because they are familiar with them.
Cognitive bias and the Halo Effect
Phil Rosenzweig, in his book The Halo Effect ... and the Eight Other Business Delusions That Deceive Managers, describes the Halo Effect as follows. When a company's sales and profits are up (a school's retention, achievement and success rates), people (inspectors or other significant stakeholders) may conclude that this is the result of brilliant leadership and strategy, or of a strong and coherent corporate (school) culture. When performance falters (success rates or league-table position fall), people conclude it is the result of weak leadership and management (the Headteacher/Principal), and that the company was arrogant or complacent (the school was coasting). In reality, little may have changed: the school's or company's headline performance creates a halo, and this halo shapes the way judgements are made about outcomes for learners; teaching, learning and assessment; and leadership and management.
What can we do? A check-list for rooting out cognitive biases
At the core of evidence-informed practice is the notion of making decisions within schools through the conscientious, explicit, judicious and skilful use of teacher expertise, school evidence, research evidence, and the values and preferences of stakeholders. It should be self-evident that cognitive bias is a real threat to making decisions which lead to favourable outcomes.
Kahneman, Lovallo and Sibony (2011) provide guidance on how to find dangerous biases before they lead to poor decision-making. They suggest you apply the following check-list.
- Is there any reason to suspect that the team or individual making the recommendation are making errors motivated by self-interest? If so, review the proposal with care and challenge any underpinning assumptions and evidence. What are the proposers going to 'get out' of the proposal?
- Has the team fallen in love with its proposals? Use this check-list to test the thinking and evidence which lie behind the proposals.
- Were there dissenting opinions, and were these opinions fully explored? Is the team a victim of 'group-think'? If so, discreetly seek out dissenting opinions or other sources of evidence.
- Could the diagnosis be overly influenced by an analogy to a memorable success? Has something similar, but different, worked in the past? For example: 'remember when we did this - that worked, so let's do the same.'
- Are credible alternatives included with the recommendation? Have alternative ways of accessing evidence been considered? Have different ways of conducting the study been considered, be they quantitative, qualitative or both?
- If this decision were to be made again in a year's time, what information would you want, and can you get more of it now? Is an internal source of evidence available to help inform the decision? Have interim reports of national studies been published?
- Do you know where the numbers came from – are there unsubstantiated numbers – have they been extrapolated from historical data? Have assumptions been made about the direction of cause and effect, which do not stand up to critical scrutiny?
- Is the team assuming that a person, organisation or innovation which is successful in one area will be just as successful in another? Just because a particular approach worked in, say, the science department, does that mean it will work in art or a humanities subject?
- Is the base case overly optimistic? Have the proposers assumed there will be no delays or barriers to implementation? Most planners underestimate the resources needed to complete a project.
- Is the worst case bad enough? Has a pre-mortem been conducted? Imagine that the project failed, then think back as to what could have contributed to the failure.
- Is the recommending team overly cautious? Have the proposers under-estimated the potential benefits? Have they down-played some of the positive indirect benefits?
Initially, four implications come to mind when thinking about how to minimise the risk of cognitive biases when engaging in evidence-informed practice within a school. First, make sure the check-list is applied. It will not get rid of all the potential biases, though it will be a start. Second, use all of the check-list; do not cherry-pick those parts which 'suit' your current stance or position. Third, separate the recommenders from the decision-makers in the decision-making process: proposed research projects need to be scrutinised by others, especially for ethical considerations, before approval is given. Finally, remember that getting decisions right is an art, not a science. You won't be able to absolutely prove that a research proposal is a good idea, but you can increase your chances of success, be aware of the odds and manage the inevitable trade-offs.
As for next week, I will be posting from researchED New York, so hope to have some fascinating news, ideas and perspectives to share.
References
Kahneman, D., Lovallo, D. and Sibony, O. (2011) Before you make that big decision... Harvard Business Review, June 2011.
Rosenzweig, P. (2007) The Halo Effect ... and the Eight Other Business Delusions That Deceive Managers. Free Press, London.
Wilke, A. and Mata, R. (2012) Cognitive Bias. In: V.S. Ramachandran (ed.) The Encyclopedia of Human Behavior, vol. 1, pp. 531-535. Academic Press.