The book is taught in business schools around the world. In the mid-1970s, Kahneman changed the way we thought about thinking. With his friend Amos Tversky, Kahneman explained that the brain creates cognitive shortcuts to resolve problems. He defined these "heuristics" as simple procedures that help to "find adequate, though often imperfect, answers to difficult questions".
Kahneman outlines the System 1 (fast) and System 2 (slow) thinking modes of the human brain and explains how we use heuristics to aid decision-making.
Availability heuristics help us to estimate the likelihood of an event occurring based on how easily we can recall similar events. We most readily remember recent ones.
But relying on local experience can be dangerous. If a worker forgets to wear their safety goggles while operating heavy machinery and experiences no negative consequences, they are likely to underestimate their risk of injury and may begin to neglect the protection altogether.
Anchoring heuristics cause us to base decisions on associations anchored in our memories.
If a worker is seriously injured early in their career, similar situations discussed even years later may pull on this anchor. Depending on how they dealt with the event, their reaction may be a raised level of awareness and understanding or withdrawal and resistance.
Representativeness heuristics help people to predict the probability of something happening based on the information available to them. If you ask a new employee how many accidents they would expect in the next year, they will probably want to know your organisation's annual average. If there have been three to five serious safety issues a year for the previous ten years, they will most likely use those figures as the basis for their estimate. This is known as the "base rate".
The representativeness heuristic is significant in the safety world. When base rates look favourable, workers can be lulled into a false sense of security.
The affect heuristic comes into play when an individual allows their preferences and biases to influence their decision-making. Humans are far more likely to follow information and advice from those they find attractive.
This can be dangerous if, for example, a worker is given two contradictory pieces of safety advice. Rather than look for the "correct" answer, they are far more apt to believe the person with whom they have more affinity. An individual's bias often causes them to associate people they consider unattractive or unkempt with stupidity and laziness.
Heuristics operate without conscious effort, producing fast answers to difficult questions by bypassing the need for extended thought. But if the logic behind them is flawed, trusting them can lead to poor decision-making.
In Thinking, Fast and Slow, Kahneman explains how trusting "gut instinct" can lead to huge errors of judgement, and that learning to act more mindfully will help people to make better decisions. It's a great aid to OSH professionals keen to fast-track their thinking and slow down the rate of accidents in their organisations.