Interventions: How well do you know your cognitive bias?

Written by: Fiona Aubrey-Smith

Cognitive bias can present itself in many forms and can hamper your school’s work to raise attainment and close the gaps. Fiona Aubrey-Smith asks whether cognitive bias is holding you, your colleagues or your school back

We are seeing a wonderful shift as education becomes an increasingly evidence-informed profession. Thanks to research and resources such as the Education Endowment Foundation’s Teaching and Learning Toolkit and Professor John Hattie’s Visible Learning rankings (2009), we are more informed than ever before about what works and what does not.

But how often have you spent hours searching for solutions to problems that your school faces – and countless staff meetings and twilights working on something – only to find it has not led to the impact that you had hoped for? Despite access to robust and comprehensive evidence sources, we are still seeing inefficiencies within our decision-making. So why is this?

It is all about cognitive bias. In other words, what is going on inside our own heads affects how we look at the world around us. All of us look at the world through a specific lens which changes depending on the context we are in and who else is involved – if you are interested in how we come to form those lenses, see my recent article for Headteacher Update (2018).

The list below – which has been adapted from Hattie and Hamilton (2019) – introduces you to the main types of bias that we have.

As an example, let us take something called Anecdotal Fallacy – the tendency to take anecdotal information (e.g. other schools recommending a strategy or resource) at face value and give it the same status as more rigorous data when making judgements about effectiveness. We have all experienced this – hearing how successful something was across our cluster, federation or local authority. We do not want to be the one not adopting something that clearly sounds like it works, so we buy in and start doing it in our own school.

However, this misses the important step of asking why it worked elsewhere (or could this be like a runaway train with other schools adopting it and not wanting to be seen as the one school where it did not work?). Are the schools where the strategy worked similar enough to our own school to make a justifiable comparison about its likely impact? Most importantly, what does the evidence – impartial, objective, robust, longitudinal evidence – say?

Cognitive bias is an emerging field within education and an absolutely vital one to understand if we are to have greater impact, both as individual professionals and through our collective efficacy – working together to have greater positive impact on children’s learning.

So here is an easy (and probably entertaining!) way to get started. Sit with your senior leadership team and talk through the types of bias in the list below. Be honest with each other and share examples where you think this has happened within your own school – perhaps collectively, perhaps individually, perhaps with specific colleagues.

Then reflect on the suggestions below about how you might reduce or avoid such bias in the future. It is important to be really honest with yourself, however uncomfortable it might at first feel. Only by recognising these biases and addressing them will we break free of the limitations that they place on our professional practice and on the impact that we could have on our children’s learning. In the list below, each type of bias is named and followed by a description of what it is, then advice on what we might do to avoid it.

Authority Bias: The tendency to attribute greater weight and accuracy to the opinions of an authority or well-known figure – irrespective of whether this is deserved (how many conference presenters or Twitter folk have you quoted without checking the accuracy of their claims?). To tackle this, try not to be swayed by famous, titled gurus. Carefully unpack and test all of their assumptions, especially if they are making claims outside their specific area of expertise.

Confirmation Bias: The tendency to collect and interpret information in a way that conforms with, rather than opposes, our existing beliefs (how much of your monitoring confirms your existing predictions rather than viewing them objectively and critically?). To tackle this, be prepared to go against the grain, and to question sacred assumptions. Remember, we tend to select education approaches, products and services that accord with our worldview, and we will often continue to believe in them, even when convincing evidence is presented that our worldview may be distorted.

Ostrich Effect: The tendency to avoid monitoring information that might give psychological discomfort (are you avoiding certain tasks because they are difficult or daunting to tackle?). To tackle this, collect robust and regular data from a range of sources about the implementation of new interventions and analyse this data ruthlessly – involve colleagues who have contrasting opinions to your own.

Anecdotal Fallacy: The tendency to take anecdotal information (e.g. other schools recommending a strategy or resource) at face value and give it the same status as more rigorous data in making judgements about effectiveness. To tackle this, probe deeply into why strategies, resources or products worked – where is the evidence and is that evidence robust, impartial and objective?

Halo Effect: The tendency to generalise from limited experiences about an individual person, company or product – assuming everything they do/offer is just as good. To tackle this, remember that everyone has strengths and weaknesses. An expert in one area will not be expert in everything. For each area of challenge, search critically for what the evidence shows works and who the genuine experts are.

But our school is different: The tendency to avoid using a tried and tested solution which evidence shows works simply because it was used or created elsewhere, claiming “but we are different here...” To tackle this, remember that we have more in common than that which divides us. Do not reinvent the wheel – adapt or adopt what evidence shows works so that it then also works for your school.

IKEA Effect: The tendency to have greater buy-in to a solution where the end user is directly involved in building or localising the strategy, product or service. To tackle this, remember that we all feel greater ownership when we are involved in the creation or adaptation of something. So channel this energy into personalising solutions that evidence shows works.

Jumping on the bandwagon: The tendency to believe that something is good because a large number of other people believe it is good. To tackle this, simply remember that it might work or it might not. Ask those on the bandwagon to point you to robust evidence. And check: what does the evidence actually say?

Cherry-picking: The tendency to remember or over-emphasise pockets of positive or negative data within larger sets of more random data (i.e. seeing phantom patterns). Ask yourself why you might be doing this subconsciously. To tackle this, look at trends over time, or trends across groups of children/schools. Do you have access to the whole dataset and does the rest of the data broadly agree with or support these trends?

Law of the Instrument: The tendency to only address problems for which you already have a potential solution – if you have a hammer, everything looks like a nail. To tackle this, identify the problems that need to be solved, then search for solutions, rather than searching for problems to which you already have a solution.

Courtesy Bias: The tendency to give an opinion that is more socially palatable than our true beliefs – are you contributing to an echo chamber in your own school? To tackle this, remember that human beings are never objective; we all look at things differently. Embrace this and use it as a way to open discussion, debate and robust conversation about what the evidence really says to all those involved.

Easy Task Blinkers: The tendency to avoid complex projects, focusing instead on projects that are simple and easy for most people to grasp. To tackle this, divert the time and energy spent on all those little projects and focus it all on fewer, bigger, more impactful projects. Go for quality strategies, not quantity of actions.

Sunk Cost Fallacy: The tendency to continue with a project that is not bearing fruit simply because so much has been invested in it already. To tackle this, for any investment – of time or money – ensure early milestones offer genuine review points that include the option to stop. Even if money has been committed, future time has not been – and it can be redirected somewhere more impactful.

  • Fiona Aubrey-Smith is a former school leader and now a doctoral researcher and consultant. She sits on several educational charity boards and facilitates a number of national networks.
