Imagine power station engineers suddenly remembering that they have left a tool inside a turbine being refurbished. The consequences could be serious. In military aircraft maintenance, this book's specific focus, a simple failure to replace a lube oil tank cap could lead to an engine bay fire or total power failure, while forgetting to reconnect a flying control could cause a fatal crash as soon as the aircraft is airborne.
This "recall of error" phenomenon suggests that our unconscious minds constantly sift through what has happened; they can then go into "alarm" mode, transmitting a sudden message to the conscious mind, a process the authors call "individual latent error detection". This study summarises the authors' research into that process, setting it in the wider context of human error generally. Their work explores the errors people make, how far it is possible to prompt these lightbulb moments of recall, and what can be done to minimise the risk.
Drawing on the latest thinking from the likes of James Reason, Sidney Dekker and Erik Hollnagel, the authors emphasise that human error is not a cause in itself but a pointer to underlying system faults: the ways of working that make us vulnerable to error. They also recognise that some workers are more error-prone than others but qualify this by noting that many such people have well-developed coping mechanisms that compensate.
For all its academic rigour, this book is surprisingly easy to read. Understanding is helped by the regular references to real-life situations that keep the narrative grounded in the here and now. The authors also apply good academic discipline by challenging the validity of their approach and looking to find flaws in their own research method. All the questionnaires, self-report forms and other key documents they used are included, as are photos of typical maintenance operations.
For anyone involved in engineering safety, the message that people often detect and correct their own mistakes is both good news and a challenge. The answer does not lie in another layer of checks, which can breed complacency if people believe there is a safety net should they make a mistake, but in helping people to take responsibility for their own work. Examples include key-word prompts in job cards and permits ("all tools removed"), providing checklists, and making the most of visual cues such as photos. In summary, this book offers a genuinely new insight into how the risk of mistakes can be reduced.
Having penned a few titles (The Safety Anarchist and The End of Heaven) which some readers may have found unusual or particularly personal, as he strayed from his typical narrative style, Dekker is back in familiar territory with The Foundations of Safety Science.
People Power is very much a book that reflects its time; as its subtitle suggests, this really does feel like 'the era of safety and wellbeing'. In this respect, the author does a fine job of mapping out how the perceived significance of this moment might play out in real-life work environments.