Lexicon

P is for perception of risk

As psychologist Paul Slovic states at the beginning of his 1987 paper on risk perception in the journal Science: “The ability to sense and avoid harmful environmental conditions is necessary for the survival of all living organisms.” Similarly, helping people to sense and avoid harm at work is a necessary part of safety and health management.


Much of the research on risk perception concerns how the general public perceives threats such as nuclear power, vaccinations, transportation modes or smoking. However, awareness of these studies’ findings is useful when we consider how to communicate risk messages to workers, and where organisations’ activities affect the public.

The Norwegian Risk Research Committee (cited by Slovic) provides a definition of risk perception as “the risk we envisage, which results from how we assess the chance of a particular type of accident happening to us and how concerned we are with such an accident”.

This definition balances chance (or likelihood) with concern (or severity), but the layperson’s perception of risk is often biased. People fear high-severity but rare events more than low-severity, more frequent ones. Similarly, events that expose many people to risk at once are perceived as more dangerous than those in which people are killed in small numbers every day. Compare aviation, rail or shipping accidents, which are relatively few but claim many casualties at once, with the daily death toll on our roads.

Slovic set out 18 characteristics influencing the perception of risk, grouped into two higher-order factors:

  • Dread risk includes lack of control, catastrophic and fatal consequences, and an inequitable distribution of risks and benefits. Nuclear power has consistently scored high on this factor, while driving scores low. Slovic explains that “the public” will accept risks from voluntary activities, such as skiing, that are roughly 1,000 times as great as it would tolerate from involuntary hazards such as food preservatives.
  • Unknown risk includes unobservable and poorly understood hazards, particularly where the impact of the hazard is delayed. Genetically modified crops were a good example when they were first introduced.

Risk perception theory tells us that people will overestimate the frequency of rare events, but any safety professional who has had their safety improvement proposal turned down with the retort “We haven’t had an accident in 20 years of doing it this way” might dispute this principle. Similarly, if people fear unobservable, delayed effects, why is it often so hard to get them to protect themselves from noise and respiratory hazards?

An alternative to the psychometric view of risk perception was offered by Aaron Wildavsky (Daedalus, MIT Press, 1990), who proposed that cultural and social factors are the key influences on our risk perception. Wildavsky’s ideas have not been as popular as the psychometric view, but they may offer a better explanation of what happens in the workplace. A new worker underestimates the risk because more experienced people behave in a way that suggests there is little danger in taking a shortcut across the loading bay or not wearing ear defenders.

“Cultural biases provide predictions of risk perceptions and risk-taking preferences that are more powerful than measures of knowledge and personality,” Wildavsky concludes.

As long ago as the 1950s, Solomon Asch tried to show that when people are completely sure about a decision, social pressure cannot make them change. He showed subjects lines of differing lengths, but surrounded each subject with confederates who identified a clearly longer or shorter line as identical to the target line. Rather than sticking to their original judgement, 75% of subjects conformed with the group’s wrong answer at least once. Asch had to change his ideas too.

These pressures can work in both directions – to cause people to see something as less risky, and therefore to accept an unacceptable risk (as with the risk misjudgement that led to the Challenger space shuttle disaster, bit.ly/2mWwsG7), or to move to an overly cautious decision – such as local authorities banning hanging baskets of flowers.

Much risk perception research is about how to persuade people to be less worried about societal risks. In the workplace, we are usually more interested in creating a more accurate picture of risk, to get people to follow rules drawn up to protect them. But we should be aware of our own perceptions. As Slovic and fellow psychologists Baruch Fischhoff and Sarah Lichtenstein pointed out in the 1981 book Acceptable Risk, there is no such thing as an objective evaluation of risk. At best, assessments “represent the perceptions of the most knowledgeable technical experts”.

 


Bridget Leathley is a freelance health and safety consultant, providing risk management support in facilities, retail and office environments. She delivers face-to-face safety training, including IOSH and bespoke courses, and contributes to e-learning courses through evaluations and design work. She has been writing for health and safety publications since 1996.
