A couple of years ago, I wrote about cognitive bias, as part of a series of articles that arose from our research into managing the agile / remote workforce. Since that time, we have learned much about neuroscience and the way the brain works, which helps to explain at least in part why we are so susceptible to biases.
Why does bias exist?
Put simply, if you have a brain, you’re biased. Biases are one way the brain saves energy – and since the brain is designed to keep us safe, it is very focused on conserving as much energy as possible, so that it has enough in reserve for times when we are under threat and need to act to get ourselves out of danger.
The brain uses prediction to determine whether we need to spend time and energy responding to a situation. If we have experienced a situation before, the brain predicts that the same outcome will occur and moves on without further processing. Biases fall within the realm of predictions, associations, stereotypes and habits – all of which seek to avoid the needless consumption of energy.
Are all biases bad?
If bias is part of our brain’s way of conserving energy, then maybe not all biases are “bad” per se. There are circumstances where an automatic judgement based on our memory of past events can be very helpful – such as knowing that when the fire alarm goes off, it’s time to leave the building, or reacting instinctively when we skid on black ice. In those moments we don’t want to spend precious seconds gathering more information before making a decision!
That said, what is clear is that we can get hijacked by mental shortcuts – i.e. by relying on memory, experience or intuition. This can damage our decision making, and in turn our relationships, when we fail to gather all the information we need before making a decision or forming a judgement about others. One such shortcut is our tendency to look for ways to confirm our existing beliefs (it’s quicker, easier and consumes less energy), and this can certainly happen to people who work apart and have less readily available information on which to base decisions about others.
Our research into managing the agile workforce shows that working remotely opens us up to attribution and availability biases, unless together we take steps to correct them, such as gathering several sources of data on a colleague’s situation or performance before making a judgement about them.
Bias in decision making
Those who have studied bias have concluded that it is very difficult for us to recognise bias in the moment – i.e. at the point when we fall back on intuition or fail to seek additional information. We are particularly susceptible to bias when we are tired, stressed or trying to multi-task.
Under these circumstances we cut corners; we save energy by falling back on established practices and beliefs – essentially relying on ourselves. This might not be a disaster in every situation; it depends on what is at stake. For example, if there is a lot of money in play, or the outcome of the decision may affect your situation, business or organisation for a long time, then it’s probably not a good idea to decide while tired or stressed. But with a decision that will only affect you over the next 24 hours, maybe it’s ok to go with your gut feeling!
I am certainly not advocating that we obsess over every decision, but I am suggesting that more awareness might be helpful, particularly if we can mitigate bias in situations that matter. During a recent webinar, I asked attendees to choose from a list of factors that are important for team performance (see diagram). I was surprised that nobody identified bias. That may be because people think they aren’t biased, or that even if they are, there’s nothing they can do about it. Whether this indicates a general lack of understanding about bias, I’m not sure, but I do know that even mentioning the word can cause discomfort, and people often become defensive at the suggestion that they may be biased.
Another thing to be mindful of is that simply teaching people about bias isn’t enough to stop them falling foul of it – awareness alone won’t change behaviour.
Strategies for mitigating bias
Given that we aren’t great at spotting bias in ourselves, and that we live in a world where we are encouraged to rely on our own resources, there is a danger that we will make biased decisions and judgements. There are, however, many opportunities to avoid this by working closely with others and setting little tripwires or traps for our biases.
The NeuroLeadership Institute has suggested a number of “nudges” that can be deployed. These rely on thinking ahead about the biases that might come into play, and planning for them. One simple approach is to predetermine how much weight a particular piece of information will carry in the final decision or judgement. Another is to agree some “if/then” scenarios to be followed in the event that we catch ourselves (or someone else catches us) behaving in a certain way during a decision-making or change-management process. Change invokes a lot of established, potentially biased, habitual behaviours which again save time and energy but don’t always serve us well.
Other ideas involve removing the cues that lead us into biased behaviour – for example, removing names from CVs to avoid gender bias in recruitment. We can also set parameters for the makeup of a decision-making group – perhaps setting proportions based on gender or race to support diverse thinking and the inclusion of different perspectives. The more we can build bias “interrupters” or traps into our processes, and the more we make this a group responsibility, the easier it will be to avoid falling into those traps – placing too much reliance on one person’s view (experience bias), on what we know has worked before (familiarity bias), and so on. For those involved in coaching and mentoring, including peer-to-peer coaching, this is an opportunity to be on the watch for bias, and to find ways to tactfully question the evidence supporting the positions taken by others.
A final thought is about looking to the future and considering different outcomes. Often, we don’t want to spend time and energy on uncertainties, so we commit to one possible future rather than thinking more broadly. Given that we aren’t great at stopping ourselves from doing this, perhaps we need to build different approaches into our processes and decision making. If a process encourages us to agree a decision but not action it until (for example) a week later, when we revisit it to see whether there are other perspectives, different angles can be considered. This is akin to “sleeping on it”. Putting distance between ourselves and the thinking that led to the first decision or forecast enables a fresh perspective, particularly if we can genuinely set the original thinking aside – perhaps by simply assuming it was wrong. Assuming something is wrong, or holding a “pre-mortem”, also enables us to be more realistic about different outcomes, pulling us away from biased thinking.
As with biases themselves, mitigation strategies need to become habitual. As creatures wired to conserve energy (and recognising that these strategies take more energy), we need something in place to force us to do the right thing (or the different thing). To take time. To entertain different perspectives. To be open to other possibilities. To acknowledge that our thinking may be biased and too reliant on gut feeling and instinct. Putting these bias mitigations in place calls not only for personal commitment, but potentially for some workplace culture change, depending upon how openly people feel able to speak and to respectfully challenge each other and the status quo.
What will you do today to tackle the unconscious bias in your decision-making and judgement of others?