Researchers call it individual latent error detection (I-LED), but it's a phenomenon we all recognise in our behaviour at home and at work. We settle down for the night, then suddenly remember we haven't locked the back door. We iron our best shirt or blouse before setting off for that big family reunion; then, a quarter of a mile down the road, we suddenly recall that the iron might still be on. Or at work, we suddenly remember that we can't go to meeting A because we are already committed to meeting B. We open our calendars to check and find it is indeed so.
Change the setting to one where errors can have more serious consequences and the stakes inevitably rise. The original research that identified I-LED as a distinct phenomenon was carried out in the UK's Royal Navy, specifically during aircraft engineering operations on ships and in land-based air squadrons. It takes little imagination to see that the consequences of a simple failure to fit or correctly adjust a safety-critical component (such as a flight or engine control) could be a crash, possibly killing or seriously injuring the crew, along with damage to, or even total loss of, an asset worth millions (see the "Maintenance error" box below for one example).
Maintenance error: BAC 1-11 explosive decompression
As a British Airways BAC 1-11 from Birmingham International to Malaga climbed above 5,200 m, the newly replaced left-hand cockpit windscreen was blown out by cabin pressure. The captain was sucked halfway through the aperture and had to be pulled back by cabin crew. The co-pilot flew the aircraft to a safe landing at Southampton. Investigators found that the diameter of 84 of the 90 securing bolts was smaller than specified. They criticised the fact that the installation had been the sole responsibility of a person working alone at night and that there had been no independent test before the aircraft resumed operation. They also identified insufficient care, poor trade practice, disregard of company standards and the use of unsuitable equipment as crucial maintenance failings.
Wider application
There are many other scenarios in which innocent errors can have catastrophic consequences: leaving tools or other foreign objects in expensive machinery that is being serviced, giving the wrong drug (or the right drug in the wrong dose) to a seriously ill hospital patient, or missing out a step in a manufacturing process, to name just three. We need only look at some of the disasters in the oil, gas and transport sectors for the part played by simple errors (often made in stressful moments) to become apparent: starting up equipment that had bits missing (the Piper Alpha disaster), pumping flammable liquids into vessels that were already full (the Texas City Refinery explosion and the Buncefield fuel depot fire in Hertfordshire, UK) and taking roll-on roll-off ferries to sea with their bow doors open or insecure (the capsizes of the MS Herald of Free Enterprise in the North Sea and the MS Estonia in the Baltic).
In all these examples, there is far more at work than an error on the part of one or two people. We now know better than to record the cause as "operator did not take enough care": modern thinking prompts us to look for the underlying systemic factors that either made such errors possible or failed to stop a chain reaction in which errors were able to trigger catastrophic consequences.
I-LED: the discovery
This brings us to the light-bulb moment that led Commander Justin Saward, a Royal Navy air engineer officer, to investigate I-LED in more detail. One day, an off-watch engineer phoned in to query whether he had carried out a procedure correctly. Saward discerned a pattern in these "recall of possible error" moments and realised their potential implications for safety.
He says: "In the navy, we have many factors already stacked in our favour. For example, in a military environment, we are fortunate to have highly trained and disciplined people who follow robust systems. What's more, they typically have a very high degree of safety awareness and consciousness. And yet, errors were occurring and people were concerned about them."
Having identified the I-LED phenomenon from these repeated examples, he worked with Southampton University's Professor Neville Stanton to research the principle in more detail. The outcomes appeared first as Saward's PhD thesis and were then developed into a book (see below) -- Individual Latent Error Detection (I-LED): making systems safer.
The mental process
Stanton says: "We use 'schemas' -- mini-maps for how we do things. But we also have a mental process in which the mind compares what we have done with what we should have done. This involves both the conscious and the subconscious. The concern comes from the subconscious; the conscious mind can, though, become aware either through a sense of unease -- the feeling that all is not as it should be -- or through one of those sudden 'I didn't lock the door' thoughts." (See two examples in boxes below.)
Safety advisers will recognise that how people respond to rules and procedures and adapt their behaviour is critical to safety. As Saward says: "A systems approach that promotes safe behaviours, both at the organisational level and among individuals and teams, is vital. But we need to recognise that highly trained and experienced workers naturally habituate to the requirements of the job they have learned to perform."
Case study: Oil and gas sector, I-LED during coring
Coring is a process in which special equipment is used to extract physical material from the well. With the entire drill string in position, which could take hours to set up, the driller will increase the pressure on the drill pipe and allow a drop ball to fall into place. This stops mud and drilling fluid going inside the inner core barrels. One operator gave this account of his recalled error: "I was relaxing in my room after hours and realised I had forgotten to insert the ball. By this time the equipment, without the ball, was well on its way to the bottom of the hole.
"I then went and told the driller and well site leader (the most senior person on the rig for that company) that I remembered I had forgotten to put the ball in place.
"They then had to pull out of the hole, after hours of running: this has to be done slowly due to potential gas release. So, all in all, down time was maybe a day or more, which was a lot of time and money wasted. It wasn't a disaster but certainly could have cost the company hundreds of thousands, if not millions of pounds. Yes, I was guilty of this at least once!"
Stanton adds: "It is a normal human characteristic to develop semi-conscious and unconscious control modes when routinely operating plant, conducting maintenance and carrying out procedures."
The danger here is desensitisation to the task's risks, even in high-consequence sectors such as construction, healthcare and nuclear power. This is often misinterpreted as complacency, which is in fact rare among workers in safety-critical organisations. When workers habituate to their workplace and become desensitised to risk, they can become insufficiently aware of their own actions (sometimes termed unsafe acts by safety professionals). This in turn can lead to undetected errors, forming a latent condition in the safety system and weakening the organisation's overall safety performance.
How understanding I-LED helps
For the safety manager or adviser, what is especially valuable about this work is that the researchers also explored the conditions in which I-LED is most likely to occur, along with how far it is possible to prompt these recall moments.
Case study: Oil and gas sector, I-LED during tank cleaning
An experienced worker in the sector recounts this dramatic example of an error that was recognised just in time to prevent a serious accident: "Tank cleaning is the most disgusting job in the world! To put this into perspective, a section of these tanks can be the size of a football pitch and up to 30 m deep. I can't even begin to describe the conditions.
"Before people can go into the tank, it must be ventilated and freed from harmful gases. Hatches are removed and a gas detector used to check the atmosphere. Then, it is vital to put back the hatches (which can be 1-2 m in diameter) to ensure no one can fall into the tank.
"During a nightshift, two people were standing about 2 m from an open hatch when a dayshift worker reappeared and suddenly asked them to walk slowly towards him. He remembered he had completely forgotten to close the hatches. Thankfully, I-LED here saved two lives: the workers were almost certainly seconds away from falling into the tank."
How can we ensure workers' safety awareness is regained so that corrective action can be taken before an incident occurs? The research suggests we need to ensure relevant cues are in place to trigger recall. The unconscious or semi-conscious control modes associated with skilled workers also need to be brought back into conscious awareness so that people reconnect with the world around them; it is then that errors are detected. Often recall is triggered by a checklist, process check, debrief, procedure (such as a permit to work), buddy system or supervisory check.
In terms of practical action, some of the most important findings on triggering I-LED are:
- Recall is most likely to occur within two hours of the original error; if possible, therefore, allow an interval before restarting critical equipment.
- Encourage reflection during and at the end of a task, for example by using the "stop", "look", "listen" technique.
- Recall rates are increased by prompts, such as checklists in work documents.
- Visual prompts are especially effective -- perhaps a photo of the equipment as it should be in a safe state. It can be something as simple as a picture of a pen to prompt "have you written up the paperwork?".
- Take breaks between safety-critical tasks: they maximise reflection and error recall.
- People are most likely to be open about their recalled errors if you have a culture in which it is permissible to put your hand up and say, "Look, I think I have got this wrong -- or I may have got this wrong". Conversely, stigmatising those who report errors can only have a negative effect.
- How you respond when someone raises a concern is critical; as with hazard reporting generally, you can either motivate or discourage future disclosures by how you react. It is essential to be positive, whether or not the person's concern was justified.
- An extra level of checks (having a second person, such as a supervisor, check all work, say) is not necessarily the right answer. Not only does this add to workload, but it can also lead to a shifting of responsibility for errors.

A more resilient safety system will better protect against human error, minimising harm to people, equipment and the environment. Employers are more likely to meet their legal, moral and financial obligations, and to achieve their commercial and operational goals, by building a resilient safety system than by trying to fix the human condition, which is to err.