Even robust data-gathering systems are subject to sampling bias and other sources of error – and interpretation can add a second layer of distortion. So how secure are the available OSH statistics, and the decisions resting on them?
National statistics from the GB Health and Safety Executive (HSE) paint a broadly reassuring picture: one of the lowest fatal injury rates across Europe, with self-reported workplace injuries and ill health below the EU average (HSE, 2020). But how solidly built is this statistical edifice? There are limitations on the data gathering – via RIDDOR self-reporting by employers and a question subset of the Labour Force Survey (LFS). How do we know which cases have been missed, and what silence surrounds those variables that never appear in the survey?
But when these statistics underpin national policy and enforcement action – as well as shaping employers’ own priorities and even the courts’ sentencing decisions where probability is at issue – any blind spots carry real consequences.
Our definitions of workplace ill health are widening, taking in exposure to diesel fumes, noise linked to ‘hidden’ hearing loss, and work-related stress manifesting as a range of health conditions. Yet the means to measure ill health have stayed static.
Meanwhile, the RIDDOR dataset is subdivided into broad categories that many feel lack the granularity to reflect today’s changing economy. Hazards faced by self-employed contractors, those working in the gig economy or those in precarious employment could be expanding out of statistical sight.
Behind the headlines
Peter Kinselley CMIOSH, associate director of health and safety consultancy Cardinus, says he is concerned that the headline numbers presented in the HSE’s annual statistical briefs are too unwieldy to allow meaningful conclusions. ‘Clients often ask “What does good look like?” but there’s just no answer in the data,’ he says. ‘I can look at the figures on the overall cost to the economy or ill health rates, but it’s impossible for an organisation to benchmark against it, or to turn it into something an SME can use.’
That deficit has been more apparent in the past 18 months, Peter says, as the experience of tracking the COVID-19 pandemic through plentiful data updates from Public Health England and local authorities drew attention to the way statistics shape organisational responses. ‘The biggest omission for me is around mental health. We tend to get big global figures, but nothing to build a strategy on.’
His views are broadly shared by Roger Bibbings MBE CFIOSH, former occupational safety adviser to the Royal Society for the Prevention of Accidents (RoSPA), who points out that RIDDOR primarily serves the HSE’s enforcement agenda but leaves the working world with limited insight.
‘The data is required by all the other players in the system, including health professionals, unions, employers’ organisations, professional bodies, and so on, that need information to assess UK OSH performance as a whole.’
The LFS is based on a representative population sample, a method that reveals a far higher burden of occupational ill health than that recorded by RIDDOR. But Roger believes that an additional survey sample could fill gaps in the HSE’s perspective, such as injuries and dangerous occurrences that aren’t reported under RIDDOR, and the hazards in the gig economy.
Roger argues: ‘The HSE’s statistical strategy has to shift from RIDDOR-based reporting to one based on high-quality, repeatable sample survey techniques. The regulatory bodies will find out far more about the state of things from constructing good samples of people than we do from pondering over the incomplete national data that comes from reporting.’
Lessons in statistics: Safety in numbers
Mike Stephens CFIOSH has written a short e-book, Safety in numbers: a first guide to the use of statistics in safety management. He would like to see better statistical literacy unlocking more insights, and directing resources to the right targets.
Health and safety managers might find that one quarter’s lost time accident (LTA) rate looks worrying, but a longer-term ‘rolling’ average sets it in context. Or, where accident or ill health data is gathered alongside data on multiple variables, a ‘regression analysis’ in Excel can sort correlation from coincidence – between, for instance, higher accident rates and untrained workers.
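As a sketch of the two techniques described here – using invented quarterly figures, not real accident data – a rolling average and a simple correlation check might look like this in Python:

```python
from statistics import mean, pstdev

# Hypothetical quarterly LTA counts over three years.
quarterly_ltas = [4, 2, 7, 3, 5, 2, 6, 4, 3, 5, 2, 4]

def rolling_average(values, window=4):
    """Rolling mean over the given window - smooths one-off spikes."""
    return [round(mean(values[i - window:i]), 2)
            for i in range(window, len(values) + 1)]

def correlation(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
    return cov / (pstdev(xs) * pstdev(ys))

# The spike of 7 in one quarter looks alarming on its own, but the
# four-quarter rolling average stays close to 4 throughout.
print(rolling_average(quarterly_ltas))

# Hypothetical share of untrained workers per quarter - a strong positive
# correlation here would support targeting training resources.
untrained_share = [0.3, 0.1, 0.6, 0.2, 0.4, 0.1, 0.5, 0.3, 0.2, 0.4, 0.1, 0.2]
print(round(correlation(untrained_share, quarterly_ltas), 2))
```

A correlation close to 1 would suggest the two series move together, but – as with any regression – it demonstrates association, not cause.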
Accident patterns can be examined by plotting a cumulative frequency graph, also known as an ‘ogive graph’. This once helped Mike to contextualise accident data on a manufacturing site where incident rates climbed, then peaked, then fell back. ‘By doing the ogive, I could pinpoint when a major reorganisation of the company started. When things got back to normal and the reorganisation stopped, it came over the top and started coming down again. You could detect the effect of an intervention,’ he explains.
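The ogive itself is simply a running total of incidents plotted against time. A minimal sketch, again with invented monthly figures:

```python
from itertools import accumulate

# Hypothetical monthly accident counts spanning a site reorganisation:
# rates climb as the change begins, peak, then fall back afterwards.
monthly = [2, 2, 3, 5, 7, 8, 8, 6, 4, 3, 2, 2]

# The cumulative series is what an ogive plots: a steepening gradient
# means the accident rate is rising, a flattening one means it is falling.
ogive = list(accumulate(monthly))
print(ogive)
```

Plotting this series makes the turning points visible – the months where the gradient changes are where an intervention, or a disruption, took effect.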
Another tool in a practitioner’s armoury is predicting future accident levels based on historical data by using the Poisson probability formula. This can calculate a base case scenario – in other words, the likelihood of reoccurrence if no other variables change. ‘If you had, say, five LTAs this year, and you know your historical average, you can work out the percentage probabilities – for instance, is there a 90%, 10% or 5% chance of having six, seven or eight LTAs next year?
‘That gives the safety manager a basis of comparison if they introduce new interventions or controls, so it could be a useful technique to build a case. It also proves immediately to the board or management team that the practitioner is having an effect.’
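This worked example can be reproduced directly from the Poisson formula, P(k events) = e^(−λ)λ^k/k!. A sketch assuming a historical average of five LTAs a year:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(exactly k events) under a Poisson distribution with mean lam."""
    return exp(-lam) * lam ** k / factorial(k)

def prob_at_least(k, lam):
    """P(k or more events) = 1 - P(fewer than k)."""
    return 1 - sum(poisson_pmf(i, lam) for i in range(k))

# Hypothetical base case: a historical average of 5 LTAs per year.
lam = 5.0
for k in (6, 7, 8):
    # Yields roughly 38%, 24% and 13% respectively.
    print(f"P(at least {k} LTAs next year) = {prob_at_least(k, lam):.1%}")
```

If next year’s actual count falls well below the base-case expectation after a new control is introduced, that is evidence – though not proof – that the intervention worked.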
On surveys, a tool almost every practitioner will be asked to use, Mike says it’s vital to know how many people must be surveyed for the responses to be statistically valid for the whole group, at a reasonable confidence interval. ‘If you have a co-worker population of 100, and a yes or no question, to get a reasonable view of the overall feeling and a confidence interval of plus or minus 5% you’ve actually got to survey 80 people, or almost all of them. All of those things need to be borne in mind before agreeing to do a survey.’
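The figure of 80 respondents follows from the standard sample-size formula with a finite-population correction, assuming a 95% confidence level and a ±5% margin of error:

```python
from math import ceil

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Required sample size for estimating a proportion, with the
    finite-population correction applied (z = 1.96 for 95% confidence;
    p = 0.5 is the most conservative assumption)."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    return ceil(n0 / (1 + (n0 - 1) / population))

# With 100 co-workers and a +/-5% margin at 95% confidence,
# almost the whole group must be surveyed.
print(sample_size(100))   # -> 80
```

The correction matters most for small groups: for a population of 1,000 the requirement is 278, and for very large populations it levels off at 385.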
At IOSH, head of health and safety Ruth Wilkinson notes that the focus on OSH historical ‘lagging’ indicators in HSE and government statistics leaves an incomplete picture. ‘If we can see leading indicators as well, we get a better picture of what is going on, using data to be predictive and proactive about current performance and align controls and interventions to them to drive change and improvements. We need that blend.’
However, the HSE itself is confident that its systems accurately reflect workplace risk. A spokesperson says: ‘We have been consistently clear on the legal duty employers have to report incidents, and are committed to getting the most accurate picture possible. Working together to have a system based on consensus is one of the reasons why we have a health and safety record that is the envy of much of the world. This is equally true of the discussion around the future of data.’
Counting the cost in safety
Meanwhile, at enterprise level, health and safety practitioners work with their own statistics, or other researchers’ data. The basic statistical primer available in most health and safety courses can leave gaps in proficiency. Are trends and relationships missed because no one calculated a rolling average over a long enough interval? How valid is a survey with a 10% margin of error buried in the small print?
Do professionals collect enough data (in line with data protection requirements as per the Data Protection Act 2018 and the EU’s General Data Protection Regulation, for example) on the contextual variables – age, experience, size of the organisation – to confidently explain accident data trends, or do they have the statistical confidence to carry out a ‘regression analysis’ and present it to the board? This could reveal, for instance, the link between training and accidents, or clusters linked to shift or overtime patterns.
While all OSH syllabuses cover measuring, monitoring and evaluating performance, Ruth acknowledges that there can be a gap between theory and practice. ‘People learn the academic understanding, but when you get into the workplace you need to apply it to that particular environment – the objectives, risk profile, controls and so on – through analysis and evaluation. It’s a skill in itself to transition into application and practice, particularly around leading indicators, which need to be identified and may be more difficult or time-consuming to measure.’
'We might have the statistics, but no access to the story behind them – the information that could yield insights'
Helene Wood GradIOSH, who has a master’s in occupational health, safety and environmental management, found during her research project that the statistics available to employers – staff wellbeing surveys, plus accident and sickness absence data – could form islands of insight that discourage ‘system thinking’, which links different aspects of the workplace experience. Managers might have the statistics, but no access to the story behind them – information that could yield insights and infuse other operational responses.
Referring to Rasmussen’s risk management framework, which argues that risk needs to be understood across boundaries, she asks: ‘If you’re in the health and safety team, you get certain information, your training team gets certain information, and so does the HR team and finance team. But how is that pulled together, within companies as well as across industries?’
Ruth notes that ISO 45001, the newly published ISO 45003 on psychosocial risk and the Global Reporting Initiative’s standard 403: Occupational health and safety (2018) will encourage employers to raise their game on data-gathering and interpretation, monitoring the effectiveness of the management system and controls, and creating new streams of statistical information for others to tap in to.
But it’s also true that companies adopting these voluntary standards are likely to be responsible and reflective employers in the first place, or to be acting on other drivers, such as investor or supply chain requirements – potentially building respondent bias into these statistics.
Technology has boosted the profession’s ability to gather statistics – from equipment sensors and loggers, camera-based vision systems, body-worn sensors and smartphone or tablet apps that cut the latency between an incident or near miss and the final report. But it can also bring its own limitations: for instance, accident reporting could be trammelled by the tick-box categories on software platforms.
Helene says: ‘The metrics may suggest a focus on one aspect, potentially overlooking other issues, such as distraction. Platforms and reporting categorisations are beneficial, but you have to be cautious.’
COVID and RIDDOR: Under the radar
The pandemic has also drawn attention to the GB HSE’s RIDDOR dataset. As the TUC points out in its May 2021 report, RIDDOR, COVID and under-reporting, just 387 COVID fatalities where workplace transmission was a factor were reported under RIDDOR from April 2020 to April 2021, of which the HSE made formal enquiries into 216. Employers also reported 32,022 infections. In contrast, the Office for National Statistics (ONS) recorded 15,263 COVID fatalities in Great Britain among the working-age population.
For the TUC, the inference that only 2.5% of working-age fatalities followed workplace transmission is evidence of widespread under-reporting, but also suggestive of a general gap between RIDDOR and the true picture on breaches of workplace safety regulations.
A spokesperson for the HSE says: ‘We don’t think it’s helpful or credible that in a pandemic where the virus has been consistently prevalent in communities, our recorded numbers of COVID-related RIDDOR are compared to the ONS figures of the entire working-age population who died from COVID.’
The HSE also acknowledges, in notes to its spreadsheet on COVID-19 data, that ‘RIDDOR suffers from under-reporting. Not all employers report cases as required under the regulations.’
At national level, Roger at RoSPA argues that the RIDDOR and LFS data sets focus mainly on lagging indicators, rather than also measuring and addressing other factors that would help employers and practitioners to assess cause and effect, or links between safety interventions and outcomes. ‘What we're really missing in the national- and sector-level picture is data on the extent and efficacy of inputs that employers are making to manage health and safety, particularly on things like training and learning from accidents, and also on their outputs such as exposure levels, for example, to threats like noise and vibration or respirable crystalline silica,’ he says. ‘Do we have good data to track the conditions that may give rise to ill health?’
RIDDOR also relies on employers reporting accidents, occupational ill health cases and dangerous occurrences, but awareness of the duty is not universal. TUC health and safety adviser Shelly Asquith says: ‘The main issue is that not enough employers know there is a legal requirement to report. The HSE needs the budget to improve its communications and get more information into workplaces.’
Peter, however, believes that responsibility to wring the best possible data out of RIDDOR should be shared. ‘Organisations need education and encouragement to report. But could others do more to raise awareness and make sure issues are reported, such as the insurance industry?’
On RIDDOR, Roger believes that other anomalies could arise from ‘sociologically determined’ factors: for instance, the data on over seven-day non-fatal injuries could be biased towards larger employers with the resources to allow absence for the treatment of an injury; whereas in over-stretched SMEs, workers with the same injury might be rushed back to work within the week.
'OSH has mapped new measuring technologies on to analytic approaches that haven’t evolved'
From a statistical point of view, the data available in both RIDDOR and the LFS can only be analysed according to the variables collected – but adding extra variables (extra columns in the spreadsheet, in effect) would allow more risk relationships to be calculated.
To interrogate what’s happening in the expanding ‘gig’ economy, for instance, the TUC wants to see finer-grained detail on workers’ employment status, rather than just ‘employed’ or ‘self-employed’. It has also suggested that ethnicity and other ‘protected characteristics’ under the Equality Act be added as reporting categories to RIDDOR and ONS LFS data.
Then there is the exclusion of work-related road accidents from HSE safety statistics, an anomaly that IOSH, RoSPA and the TUC all argue flatters the UK’s safety performance and creates a disconnect in safety management. ‘Work-related road injury is by far and away the biggest source of occupational injury. We’ve campaigned on this for 25 years,’ says Roger.
A spokesperson for the HSE says: ‘We don’t shy away from the fact that, statistically, RIDDOR isn’t our preferred source for work-related injury and ill health.’
Robin Pressley TechIOSH, a health and safety manager at Collins Contractors, has become increasingly concerned at the implications of non-reporting. ‘The sector has an emphasis on recording statistics, but this is an area where no one is recording them comprehensively, so our picture of the risk is skewed. It’s easy to have zero RIDDOR accidents if you don’t have to report any! From a health and safety management perspective it would assist with control measures if there was more information as to the cause of injuries while driving for work.’
In addition, as online retail shifts deliveries from large operators with responsible fleet management practices to a growing army of freelance or self-employed drivers working under considerable time pressure, the volume of uncharted work-related accidents is likely to be on the rise.
At IOSH, Ruth Wilkinson says: ‘We've been calling since 2001 to change the RIDDOR requirements, to include work-related road traffic accidents. And now we need to recognise that road risks are changing, for example, we have occupational driving within the gig economy, and we have autonomous vehicles coming, so this is an area for review.’
Staying up to speed
While some industrial sectors – such as waste management, utilities, or quarrying and aggregates – do collect statistics, others are reluctant to share and benchmark data that could perhaps tarnish corporate reputations. Peter, who finds the lack of benchmarking data frustrating, notes that some trade associations aren’t able to publish data ‘because of the political difficulties of calling people out’.
RoSPA is now hoping to contribute new impetus to the business-to-business benchmarking agenda, by securing HSE authorisation to re-use its 1999 guide (HSE INDG301 Health and safety benchmarking: improving together), which had been withdrawn by the HSE in 2011.
The pandemic has certainly ushered in a ‘reset’ moment across many areas of our working and personal lives. In addition, other areas of the economy are embracing better data as an engine of change – for instance, utility companies are hiring data analysts to act on data from smart meters and Internet of Things devices, and educationalists are using pupil and cohort data to drive up standards.
OSH may have adopted new measuring technologies, but has largely mapped them on to analytic approaches that haven’t evolved with the modern world of work. We should be more ambitious in refining data-gathering and analytic processes to ensure OSH stays up to speed in a new data-powered era.
HSE sampling: How does the Labour Force Survey work?
The LFS from the ONS is one of the primary sources for the GB HSE’s workplace injury figures.
The latest survey (ONS, 2021), which took place from January to March 2021, was based on a sample of 37,098 households containing 82,015 participating individuals. The LFS sample design cannot guarantee adequate coverage of specific industrial sectors, so offers a less stratified look at workforce behaviour. The total response rate for Great Britain, excluding imputed cases, was 23.6%, down 2.5 percentage points on the previous quarter. This is part of a slow downward trend in response rates: in January to March 2011, the response rate was 50%.
In around one-third of cases, a proxy answers on behalf of the designated respondent. According to research focused on data from 1993-94 to 2003-04 (David and Jones, 2005), proxies report around 24% fewer cases of work-related ill health or injury than those answering questions on their own behalf. This suggests a degree of under-reporting in the final tallies.
The loss of face-to-face interviewing caused by COVID-19 led to a reliance on phone contact, and a sharp decrease in response rates (ONS, 2020). Non-response bias led to a decrease in interviewees who lived in rented accommodation and an increase in owner-occupiers. Tenure weighting has been used to try to address this imbalance and present a more accurate picture of a workforce living under COVID.
GRI standard 403: occupational health and safety Global Reporting Initiative (2018) [accessed 30 July 2021]
Health and safety at work: summary statistics for Great Britain 2020 Health and Safety Executive (2020) [accessed 30 July 2021]
RIDDOR, COVID and under-reporting TUC (2021) [accessed 30 July 2021]