Whatever we like to think, we are all biased. Many of our unconscious biases come from natural psychological processes, which have evolved to help us cope with complexity and, in earlier times, to survive. There is too much information around us to take everything in, so our brains select what seems most relevant.
Much of the time this strategy is successful. In the supermarket searching quickly for your favourite brand of baked beans, you look for the right colour tin in the right location. Usually that works and saves time. Until one day you buy soup by mistake because the shelves have been reorganised. As you look at the tin marked "soup" you are amazed that you did not notice.
Our brains have many rules, known as heuristics, to filter the mass of information available to us all the time and to fill in the gaps when some of it is missing. Most of the time, heuristics are helpful; but they can be the opposite if we don't monitor them when we need a detailed, accurate picture of a set of circumstances -- during an investigation, for example.
The set-up
Now you know
Safety scientist and academic Sidney Dekker complains in his book, Just Culture (bit.ly/2n1HBVV), that hindsight oversimplifies causality and overestimates the ability of other people to have foreseen the results of their actions. We know what happened, so surely they could have seen it! Dekker explains: "While there is always a gap between written guidance and actual practice (and this almost never leads to trouble), that gap takes on causal significance once we have a bad outcome to look at and reason back from." The problem with defining an action or event as a cause (root, underlying or immediate) is that it ignores the fact that the same action may have occurred many times before without leading to a problem.
David Ramsay prefers an alternative to immediate cause: "'Active failure' leads you to consider what activity or activities occurred to enable the incident to happen." For example, a person standing underneath a dropped load did not cause the dropped load, but it was an active failure without which the person would not have been harmed.
Andrew Hale, who has 40 years' experience of industrial accident research and was professor of safety science at Delft University of Technology in the Netherlands until 2009, says: "The biggest investigator bias is hindsight. From a position after the accident the decisions taken may look so obviously wrong that it is hard to conceive how they were taken. What investigators need to do is learn to put themselves in the place of the decision-maker and keep probing until they can say, 'If I had been in that person's place, I would have done the same as they did'. Then you really understand the actions."
Our brains naturally look for a starting point for an opinion. If you are haggling for a trinket at a tourist market, the seller's opening offer will set your expectation for the price that might be negotiated. Similarly, the way a question is asked sets an expectation of the anticipated answer. This heuristic is called anchoring or framing. If the initial investigation question was, "Who is to blame for this accident?" the process would clearly be driven by finding someone at fault. However, even more subtle questions such as, "What went wrong before the accident?" imply that a fault must be found. A question such as, "What were the circumstances leading up to the accident?" provides a framework for asking about a range of conditions and actions.
The UK Health and Safety Executive's (HSE) guidance Investigating Accidents and Incidents (HSG 245) (bit.ly/2FBzuar) raises the issue of bias and explains that an investigation "should be thorough and structured to avoid bias and leaping to conclusions".
Our tendency to draw a conclusion quickly from the evidence presented to us stems from our evolution; the successful primitive humans were the ones who could make rapid decisions about whether to run from an animal or kill and eat it.
It may be that an early hypothesis about an accident is fine as long as we continue the process to seek more evidence. But here confirmation bias is a risk. Once we have a theory, it can be difficult to adjust our position. Our natural tendency is to look for more information that confirms our initial hypotheses and to discount information that does not.
David Ramsay is aware of this tendency. He has 30 years' experience of incident investigation in industries including aerospace, oil and gas, explosives, transport and utilities. He has developed the Kelvin TOP-SET incident investigation method.
"We don't allow the word 'obviously' in our company or the expression 'I've seen it before'," he says. "In my experience, there are amazing similarities between incidents, but you have to look deeper to find the differences."
Changing our minds once we have made a choice is difficult, and the principle known as the sunk-cost fallacy helps to explain this. If you have already invested time and effort in crafting one "solution" to the problem, it can feel like failure to abandon that explanation, even though evidence for an alternative is available. One way to overcome this is to involve someone new, perhaps part way through the investigation, and to present them with only the factual evidence.
Backward looking
A criticism of accepted ways of investigating accidents is that they promote hindsight bias, the inclination after an event has occurred to see it as having been predictable.
Approaches such as "five whys" (an interrogative technique designed to reveal fundamental causes) and causal factor analysis (used to determine immediate, underlying and fundamental causes) make hindsight bias more likely. Why was the worker injured? Because the worker stood beneath the load. Why did the worker stand beneath the load when training had warned them against this? The only answer to that seems to be to blame someone.
How we see people
We don't like to consider ourselves as prejudiced against any group of people, but there is evidence that most people are subject to some unconscious stereotyping. Sometimes these stereotypes can appear to be positive -- that a particular group "tends to be" stronger or less aggressive. We cope with these underlying views in everyday life, but in an investigation we need to acknowledge how they can colour our approach to people and perhaps the weight we give to their evidence.
The focusing illusion occurs when one feature of someone's personality is perceived to have a greater influence on events than it really does. A worker's manager comments that "he's often late for work" and the investigator interprets this as meaning "he's unreliable".
By contrast, if we see one positive aspect to a person, we are more likely to assume they have others. This is known as the halo effect. Someone who is physically attractive and smartly dressed may be more credible than someone less attractive in dirty overalls. The impact of these preconceptions on weighing evidence provided by management and by workers is obvious.
There is also a more generalised tendency to see other people as more likely to be blameworthy than ourselves; this is called the fundamental attribution error, or correspondence bias. If you make a mistake when driving, it is easy to blame the situation -- perhaps misleading traffic signs or poor road layout. If someone else makes a mistake, you think that person is either a poor driver or deliberately being difficult.
Another effect, actor-observer asymmetry, means that the more different the actor -- the injured party, for example, or another worker -- is from the observer (investigator or interviewee), the more likely the observer is to blame the actor's actions; if the actor is seen as similar, the observer is more likely to look for situational factors.
So when forming an investigation team it is important to have a mix of people -- involving representatives from different levels of the organisation, for example -- to provide more balance. Ramsay is clear that "job backgrounds influence decisions and findings".
Oliver Mellors, a human factors professional working in high-hazard industries, has seen the results of the fundamental attribution error.
"It leads to a flawed investigation and produces remedial actions that do not address underlying causes. For example, investing time and money into behavioural-based safety programmes rather than addressing the environmental factors."
Interviewees might not be consciously changing the truth, but unconsciously we have a tendency to post-rationalise decisions we have made. Choice-supportive bias describes the phenomenon that occurs when, having made a decision, we give more positive reasons for the choice than we had at the time. For example, having decided which of two laptops to buy, we tend to justify the purchase by talking up the benefits of the model we chose and denigrating the features of the one we left behind.
So, if after an accident in a loading bay the investigator asks one of the drivers, "Why did you leave the vehicle there?" the driver might respond: "I believed that to be the safest place to leave it." But the real reason might have been it was the first space the driver saw.
A better way to uncover the real reason is to ask the question in a different way that doesn't relate to an accident. For example, "Where do you normally park?" or "Where did you park this morning?"
Under analysis
All animals learn to form associations between one thing happening and another -- to look for cause and effect. A red sky at night might generally lead to fair weather the next day, but meteorologists can explain why the fair weather occurs, and it isn't caused by the red sky. Looking for a correlation helps us to predict and respond to our environment. However, if we give it free rein when analysing the information collected during an investigation, it can lead to oversimplification of a sequence of events and false or illusory correlations. Stereotyping, halo effects and focusing illusions are examples of illusory correlations -- when we come to think that people from a particular group, or people who dress in a certain way, or who have been unhelpful in one situation will have other characteristics we associate with that group or behaviour.
Illusory correlation can also apply to the broader explanation of an accident. We notice that the same type of accident has occurred three times on successive Thursdays. There may be a hypothesis worth asking some questions about, such as whether the loading bay is busier on Thursdays or whether something happens on Wednesdays that leads to greater fatigue the next day. But without further evidence we should not assume it is anything more than coincidence.
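To see how easily such "patterns" can arise by chance, the short Python sketch below runs a simple simulation. The figures are invented for illustration only (eight accidents, each assumed equally likely to fall on any of the five working days, independently of the others), and it checks a looser version of the pattern -- at least three accidents landing on the same weekday, not necessarily in successive weeks.

import random
from collections import Counter

def apparent_pattern(n_accidents=8):
    # Illustrative assumption: each accident lands on a random working day (0=Mon ... 4=Fri)
    days = [random.randrange(5) for _ in range(n_accidents)]
    # "Pattern" here means at least three accidents sharing one weekday
    return max(Counter(days).values()) >= 3

trials = 100_000
hits = sum(apparent_pattern() for _ in range(trials))
print(f"Three-on-one-weekday clusters appear by chance in about {hits / trials:.0%} of runs")

With these made-up numbers the cluster appears in the large majority of purely random runs -- a reminder that an apparent weekday pattern is a prompt for questions, not a conclusion.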
The availability heuristic has a similar effect -- we tend to use explanations for outcomes that are readily available. If the last three times I investigated a ladder accident one of the risk factors was poor siting of the ladder, this explanation will be readily available to me when looking at a new ladder accident. If confirmation bias kicks in, I am likely to stop if I find the same risk factor this time and perhaps miss the fact that the new ladder arrived in an unfit condition for use.
Structured approaches can help to overcome this bias. But the general aim should be to be clear about facts -- and if there are gaps in the facts, document them rather than fill the holes with illusory correlations or ideas from something we heard about recently.
All the answers
The HSE advice in HSG 245 is that to be free from bias the analysis must be carried out "so all the possible causes and consequences of the adverse event are fully considered". In reality, just as we don't carry out detailed risk assessments of "all possible hazards", it is not practicable to consider "all possible causes".
Some sources of advice on cognitive bias suggest that just by making people aware of the possibility of bias they will be able to overcome it. By contrast, other sources argue that training in unconscious bias might have the opposite effect in reinforcing stereotypes and legitimising prejudices.
Dr Linda Bellamy, major hazards and occupational safety human factors consultant, says: "I don't believe that you can avoid hindsight bias just because you know it exists." Drawing on her 30 years of studying how to represent events before and after major accidents, she believes there must be a change of emphasis "away from looking at what failed with respect to the standards, rules and regulations, and towards a broad understanding".
Other practical solutions involve the use of balanced teams. Ramsay gives one example. "A team made up entirely of engineers will immediately see an engineering failure. Involving office staff can bring some balance, and mixed gender teams also help to overcome observer bias." Referring to the Myers-Briggs personality types, Ramsay recommends having a balance between judgers and perceivers. "Introverted, thinking judgers are great for getting things done but not for reflecting and standing back in an investigation."
However, the desire for unanimity in a group can result in "groupthink". For primitive people, being part of a group was essential to survival, and our brains still show measurable discomfort -- which has been compared to physical pain -- when our opinion deviates from the group's. As a result we have a natural tendency to adjust our opinion to fit in. A classic study by Solomon Asch in the 1950s (bit.ly/1hPP3WU) showed that people will even judge the length of a line on a piece of paper differently if everyone else expresses a different opinion. As with the sunk-cost fallacy, one way to overcome groupthink is to bring in new people to review the information part way through.
Bellamy supports this approach. "An independent third party can play devil's advocate, and take a fresh look at the situation," she says.
The newcomers must be prepared to challenge the investigators' thinking and look to see where gaps have been filled inadvertently with bias rather than information.
Another way to reduce groupthink is to ask people to compose their conclusions separately before they compare notes and consult each other.
It is important to remember that these biases are mostly unconscious -- we don't set out to be prejudiced against any group of people, or to jump to conclusions, or to ignore information that contradicts our existing ideas. However, our natural way to deal with the world is to take shortcuts in receiving and processing information. If simple awareness of bias is not enough, the best way to overcome our natural habits may be to work as a team and bring in critical colleagues.