The origins of task analysis lie with time-and-motion studies from the early 20th century. In 1911 Frank Gilbreth published a study of bricklayers he had carried out with his wife Lillian Gilbreth, an industrial psychologist and engineer. They documented each separate movement and decision involved in laying a brick.
By developing tools and job aids (such as easily adjustable scaffolding) and rewriting the work procedures to change the way the “low-priced men” positioned the bricks for the “high-priced men”, they reduced the number of movements for the latter, claiming a reduction in worker fatigue, improved efficiency and lower costs.
Task analysis continues to be a useful way to look at how a job is done, whether for physical tasks such as bricklaying, cognitive activities or, more often, ones that combine both types. As well as the Gilbreths’ use of the process to improve task design and allocation, it is deployed in risk assessments, to evaluate workload and as the basis for training needs analysis.
There are multiple techniques available, but all task analysis processes have five steps in common:
- define the purpose of the analysis and its boundaries
- collect the data
- describe the tasks
- analyse the task description
- review, revise and improve.
Purpose and boundaries
[Box: Hierarchical task analysis]
Without a clear statement of the purpose and the boundaries of the project, task analysis can result in “analysis paralysis”. Being clear about the expected output, and the jobs, locations, equipment, time periods and substances covered will inform the stopping rules essential to prevent this.
Shaun Lundy, technical director at 4site Consulting, used task analysis as the basis of a training needs analysis for Colt Telecom in 2003, limiting the scope to two jobs: installer and customer site planner. In other cases, the scope might be much wider. A 1997 study in the International Journal of Industrial Ergonomics by Craig Halpern and Kenneth Dawson (bit.ly/2Oltff5) describes a task analysis of sewing jobs in an automobile products factory, which considered ergonomic design, process changes, rest periods and even the development of a structured exercise and stretching programme. But the analysis still had limits, concentrating on six jobs responsible for the greatest number of musculoskeletal disorder (MSD) claims.
Stephen Butler, a workplace safety and health specialist in Australia, had a clear purpose for his task analysis of water main cleaning and relining: to produce clear work instructions specifying precisely “what equipment to use at each point in the task, who is authorised, where to stand, how to lift, who to involve and when”.
Stating the purpose of a task analysis as “risk assessment” needs clarification. A simple one-layered task analysis can serve as the basis of a hazard identification study in process safety. However, the task analysis used by Janette Rose and Chris Bearman to evaluate the impact of a new in-cab computer system on train safety, described in Applied Ergonomics (bit.ly/2Ab6rf3), had multiple levels, with each task broken into further sub-tasks (a hierarchical task analysis, or HTA – see box above).
From the HTA the researchers could identify what might go wrong, suggest risk controls and re-evaluate. Another rail-based study by Australian academics Anjum Naweed, Ganesh Balakrishnan and Jillian Dorrian (Applied Ergonomics, bit.ly/2OpTQYy) used a similarly detailed approach to look at hazards introduced by a change from two-driver operation on freight trains to single-driver operation.
The new management standard ISO 45001 encourages not just the assessment of workplace hazards but also the assessment of risks to the OSH management system itself. The guidance to the standard (ISO 45002) offers examples such as inadequate allocation of resources, ineffective audit programmes or poor succession planning. So it is possible to extend the boundaries of a task analysis to consider the safety management system itself.
The same task description can be used for more than one purpose. When Sean Walker, HSEQ project manager at 3JV, produced a detailed task description to consider how to manage the risks associated with confined space work (CSW), he was able to use the same description to design the training course needed for the new CSW process.
Collect the data

Though the effort you invest must be proportional to the scope of the project, better data will lead to a better analysis. Most successful task analyses use more than one data source and cross-reference the results. At its simplest, you might start with a written procedure for the task to be analysed and cross-refer to observations of the task being carried out. Lundy used job descriptions and method statements to familiarise himself with the tasks, then made observations, followed up with interviews.
In both the rail studies the researchers used two forms of observation – “passive” observation, in which they watched normal operations, and a talk-through protocol, which involved the driver explaining what they were doing as they did it.
Kevin Robson, management consultant at RTMS Global, admits to using a different form of observation: “I discuss the current procedure with the operators first to understand what they feel that they are required to do. I then observe them semi-covertly to see if they do what they say they do. Sometimes they tell you what you want to hear but actually do something else.”
Both rail studies used one-to-one interviews and group discussions. The groups created scenario descriptions, stimulating further discussion of the roles and the factors associated with safe operation. If working with groups, watch for group biases – such as the tendency for someone to self-censor if they think their approach is different to the majority. But a group discussion can also uncover details that might not be revealed in interviews and would not be obvious from observation.
In both rail studies several drivers were consulted, because basing a task description on how one person thinks the job should be done is less likely to provide an accurate picture. Walker agrees: “I sat down with the workers for each and every CSW, from cleaners, mechanical, electrical and any other trade that entered any of the CSWs.”
Butler combined discussion with observation. “As they worked, I talked to the work crew about the safest, most effective method of carrying out the task,” he says.
Another increasingly popular method of data collection is video and photographs. Even back in 1997, Halpern and Dawson videoed the sewing machinists to make it easier to catch the detail of each task. Robson uses supplier or manufacturer videos of products if they are available, and if they aren’t, “I take videos or photos of my own, for example of a work area or equipment”, he says. Butler took photographs of every step of the water main cleaning process, using these not just for analysis but also to include in the documented instructions that were the project’s goal.
Describe the tasks
The simplest method of task description, used by Walker and Robson, is a single list of subtasks. The approach used by Lundy and Rose and Bearman is the HTA, which starts with a simple statement of the task goal and can be refined as much as it needs to be. However, it is important to know when to stop. To explain, see the two branches of the tree in the figure on p 35, based on the Rose and Bearman rail study.
The top goal is: drive the train from its origin to its destination safely. This is subdivided into seven further tasks, though only two are shown in the diagram. The first subtask, monitor and respond to external events, is divided and subdivided to three further levels, while the second subtask, monitor and respond to gauges and alarms, has just one further level.
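The tree structure described above can be sketched as nested data. This is a minimal illustration only: the named tasks come from the Rose and Bearman example, but the intermediate step (“apply braking”) and the exact placement of each branch are assumptions made for the sketch.

```python
# Sketch of a hierarchical task analysis (HTA) as a nested structure.
# Named tasks follow the Rose and Bearman rail example; the "Apply
# braking" node and the branch placements are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    subtasks: list["Task"] = field(default_factory=list)

    def depth(self) -> int:
        """Levels below this task (0 for a bottom-level task)."""
        return 0 if not self.subtasks else 1 + max(t.depth() for t in self.subtasks)

hta = Task("Drive the train from origin to destination safely", [
    Task("Monitor and respond to external events", [
        Task("Comply with relevant signs", [
            Task("Apply braking", [                           # assumed step
                Task("Check results of braking application"),  # bottom level
            ]),
        ]),
    ]),
    Task("Monitor and respond to gauges and alarms", [
        Task("Monitor speed to ensure within speed limit"),    # one level only
    ]),
])

# The first branch runs three levels deeper than the second.
print(hta.subtasks[0].depth(), hta.subtasks[1].depth())
```

Representing the hierarchy explicitly like this makes the “how far to decompose” question concrete: each branch can stop at a different depth.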
In their 1992 book A Guide to Task Analysis, Barry Kirwan and LK Ainsworth state: “The analyst can stop where he or she feels it is justified.” Lundy’s task analysis went to three or four layers. “I stopped when further description lost its usefulness,” he says.
A simple rule would be: “From the description, would an untrained person know what to do without further explanation?” “Make a cup of tea” could involve leaves or bags, a pot, a cup or a mug, served with or without milk or sugar; “plug in the kettle” would be interpreted in only one way.
Though the probability of making tea in different ways is high, the consequence is low – does it matter whether leaves or bags, a pot or a mug are used? The “PxC” stopping rule proposed in 1967 by John Annett and Keith Duncan takes account of both the probability of an operation being performed inadequately by an untrained operator (P) and the consequence of inadequate performance (C).
When PxC is small enough (an acceptable risk) the analysis can stop. The probability of failing to “monitor speed to ensure within speed limit” is low (even though C might be high), so there is no need to go to the next level. However, “comply with relevant signs” has a high C and a high P since, without knowing what the signs are and what to do at each one, it would be easy for an untrained operator to make a mistake. Only when the analyst reaches the bottom-level step in our rail diagram (see above) – check results of braking application – can the analysis stop.
PxC calculations should not be treated as mathematical certainties any more than scores of one to 25 in a risk assessment matrix would be, but the concept can help to determine whether a further iteration is needed.
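The stopping rule can be expressed as a simple check. In this sketch the 1-to-5 scales and the threshold value are illustrative assumptions, not figures from Annett and Duncan's original work:

```python
# Sketch of the Annett and Duncan "P x C" stopping rule: decompose a
# subtask further only while the product of the probability of
# inadequate performance (P) and the consequence of that failure (C)
# exceeds an acceptable threshold. The 1-5 scales and threshold of 4
# are illustrative assumptions.

def needs_further_analysis(p: int, c: int, threshold: int = 4) -> bool:
    """Return True if the subtask should be broken down another level."""
    return p * c > threshold

# "Monitor speed": P low even though C is high -> analysis can stop.
print(needs_further_analysis(p=1, c=4))   # False

# "Comply with relevant signs": P and C both high -> decompose further.
print(needs_further_analysis(p=4, c=5))   # True
```

As the article notes, the numbers are a prompt for judgement rather than a calculation to be trusted on its own.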
Although popular, HTA is not the only detailed technique. The study of two-driver to one-driver operation also used a timeline to show where the latter would result in the driver needing to do two things at once, and therefore having to make a decision about which task to omit or delay. Robson is flexible in his way of describing the task for analysis. “Whatever works,” he says, sharing examples of flow charts for emergency evacuation and project risk assessments and tables for process and maintenance operations.
For Butler, the photographs were critical for “detailing the sequence of the activity steps” pictorially.
David Ramsay, group managing director at incident investigations outfit Kelvin TOP-SET, uses task analysis in his accident investigation work, honed through 30 years’ experience of investigations in high-hazard industries: “We would analyse tasks in detail, step by step, to find out what was going on in the period preceding the incident.”
To represent the data for analysis he suggests an approach from the chemical industry. “I’m increasingly in favour of bow-tie diagrams. They give a clear view of which barriers need to be in place to prevent incidents and which defences need to be in place to mitigate any failures.”
Task description analysis
Producing the task description is the first stage of analysis; the next usually involves modifying or annotating that description. The timeline analysis in Naweed et al’s study was reviewed to find out where conflicts would occur when moving to a single-driver operation, and redrawn to see the impact of the driver stopping the train to deal with a problem.
The method of analysis will depend on the purpose of the study defined in step one. If the purpose is risk assessment, each subtask should be risk assessed; if it is a training needs analysis, ask: “What competence does this subtask need, and how is it best provided?”
In one of Robson’s examples, the analysis of each subtask includes identifying equipment, resources, conditions, hazards, current and proposed control measures, risk ratings and responsibilities. Walker’s approach is straightforward, with just two actions for each task step: identify the safe behaviours (such as “set-up anchor points”) and identify at-risk behaviours (such as “catching fingers while coupling”).
Analysis can make use of existing taxonomies. Denham Phipps, George Meakin and Paul Beatty created an HTA for the planning and delivery of anaesthesia (bit.ly/2vppQE3), and then “each task step at the lowest level of the hierarchy was classified according to whether it was primarily skill-based, rule-based or knowledge-based”.
In another healthcare study, in 2006, Rhonda Lane, Neville Stanton and David Harrison used an HTA to describe drug administration in a hospital from prescription through dispensing to monitoring. Each bottom-level task was assigned one of the 24 error modes of the systematic human error reduction and prediction approach (SHERPA), such as action omitted, check incomplete and wrong information communicated. Each error mode was then assessed to see what additional controls could be provided. Solutions included improved labelling with colour coding and changing storage locations.
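The tagging step in that study – assigning an error mode to each bottom-level task, then reviewing controls – can be sketched as follows. The task names and controls are hypothetical, and only three of the 24 error modes named in the text appear here:

```python
# Illustrative sketch of tagging bottom-level tasks with error modes
# from a taxonomy, then listing tasks that still lack a control.
# Task names and controls are hypothetical; the error modes are the
# three named in the text (the full taxonomy has 24).

tasks = [
    {"task": "Select drug from storage",
     "error_mode": "wrong information communicated",
     "control": "colour-coded labelling"},
    {"task": "Record administration time",
     "error_mode": "action omitted",
     "control": None},
    {"task": "Verify patient identity",
     "error_mode": "check incomplete",
     "control": "two-person check"},
]

# Tasks whose assigned error mode has no control yet assigned.
uncontrolled = [t["task"] for t in tasks if t["control"] is None]
print(uncontrolled)
```

Working through the list systematically, rather than relying on inspection, is what makes a taxonomy-based analysis auditable.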
Detailed taxonomies might be too much for a straightforward OSH job, but reminding yourself of the models available could be useful if your purpose is to identify problems, design job aids or determine training needs.
Review, revise, improve
Our final stage, and a key success criterion mentioned in many of the studies and by practitioners who have used task analysis, is the need to review, revise and improve. After the new CSW systems were introduced, Walker wanted to review how they worked in practice: “I was on site every day looking at all entries into confined spaces to iron out any issues that arose. There were a couple of things to change – nothing overly major. After about six weeks I moved to weekly reviews.”
Butler’s task analysis was also under constant review: “It was often updated, for example if a worker’s responsibilities changed, new technology was introduced, or plant or equipment changed.”
A task analysis can’t capture all the information needed, and some projects supplement it with tools to measure pain, posture or handling forces. Naweed et al pointed out that their HTA couldn’t show psychosocial factors, such as boredom in a single-driver operation, or whether a second driver might be a distraction.
However, the benefits are clear. Halpern and Dawson’s improvements reduced workers’ compensation claims for MSDs by 87% in the two years after the intervention.
From his experience, Lundy cannot imagine anyone doing a risk assessment or a training needs analysis without some form of task analysis: “A risk assessment without a task analysis will inevitably be flawed. A systematic training needs analysis with a good underpinning task analysis will result in more pertinent training.”
The time and effort of the analyst and the need to involve stakeholders appear to be two of the reasons the technique is not more widely adopted. Robson has experienced resistance to task analysis.
“The objections usually come with regards to time and hence costs, and the delays in getting the job from receipt of order to the client,” he says. “While the main argument will always be health and safety, I sell the idea on the basis that the task analysis identifies inefficiencies and opportunities for saving money.”
It can also help if there is a recognised problem. “This is far easier if they have made a mess of a job in recent memory,” says Robson.
Butler’s project followed a serious degloving incident, and Walker’s work was authorised because there had been high-profile CSW fatalities in other organisations, and an off-the-shelf CSW system from a consultant had been “shoehorned in”. As well as improving safety, Walker believes the project had a cost-benefit: “The task analysis stripped the task right down to its core and we reduced the number of people required to stand at certain CSW locations because we knew the task-specific hazards.”
Butler points out another benefit: “Task analysis allows for all workers to be involved in improving safe working. Based on my experiences, the workers become passionate to ‘get it right’ and are then keen to ensure it ‘stays right’. It’s a true collaborative approach to safer working.”
Butler’s experience of the power of task analysis to engage is reflected by Robson: “The ISO standards [9001, 14001 and 45001] all demand leadership, commitment and a risk-based approach, making task analysis an essential element.”