Automata uncaged

Monday 1st February 2016

Science fiction impresses upon us that perfection in the robot world is to replicate a human in every way, from appearance to emotions. But in equal measure it uses those traits to scare us, Blade Runner style, about what happens if the boundaries blur between people and machines. Either way, it sets up the expectation that robots should be the same shape as us and think as we do, an expectation that is still far from reality.

Many of those assumptions have seeped into the public consciousness; ask people to describe the ultimate robot assistant and you'll get Star Wars' C-3PO, not R2-D2: a machine they can have feelings about and that can somehow have feelings back.

Not meant

In the real world, robots have taken two different paths. We have industrial automata, designed purely for function and reliability and epitomised by the Unimate robot arm seen in factories the world over since the early 1960s, and we have human-interactive robots designed to tug on the heartstrings, such as the somewhat disturbing Paro robotic seal used to keep older people company in Japan.

Until recently there has been little crossover between the two worlds. Industrial robots live in isolation, working in secure workcells surrounded by wire mesh and interlocks, where they can do no harm to their fleshy colleagues. There is a handful of accidents involving robots worldwide each year but almost invariably they happen during maintenance or setup, when the interlocks are defeated and an engineer is inside the cage. The robots are not responsible for those incidents any more than someone using a grinder without safety glasses could hold the sparks responsible for an injury.

There is no attempt to teach the robot how to behave around people, nor any need to. Written standards for industrial robot safety, such as the US Occupational Safety and Health Administration's directive STD 01-12-002 (http://1.usa.gov/22a2snC), make it clear that the only acceptable solution is to isolate humans and robots from one another with fences and interlocks. We live in our world; they live in theirs.

Predictions from the 1950s that by now we would all have domestic robots cooking our breakfasts and babysitting our cats seem ridiculous, don't they?

Opening envelopes

Or perhaps not. The definition of a robot is any machine that can be programmed to carry out a sequence of actions without human intervention. You can buy timed feeding bowls to deliver sustenance to the front end of your cat while you are away, and automatic litter trays that deal with the other end of the problem.

The international standard for robotics, ISO 8373 (bit.ly/1lXJEqN), defines a "manipulator" as: "a machine in which the mechanism usually consists of a series of segments, jointed or sliding relative to one another, for the purpose of grasping and/or moving objects usually in several degrees of freedom".

But that category makes up a tiny fraction of the robot population. For every factory robot juggling car parts there are ten robot vacuum cleaners and lawnmowers beetling about people's homes. There's little public association between the two, and little fear over safety in either case. Justifiably so, because the industrial robots live in their cages and the domestic ones are too small and innocuous to cause any real harm.

The problem comes when you take a robot that has the strength and functionality to cause major damage and move it from the workcell into the workplace -- putting people within the machine's envelope of movement. This is now something that researchers the world over are trying to do, and it raises a number of questions for designers, safety and health policymakers, philosophers and the public.

Developing a machine that can cope with the complexities of our human environment is a monumental engineering task, and it's certainly true that we don't yet have the computing power to build anything remotely like C-3PO. But a robot that can sit alongside humans on a production line is already on sale. The limiting factor in development is a more fundamental question: should robots be intrinsically safe?

It's not a new question by any means. Before workplace safety management evolved, many people went home at night simply because they knew not to get too close to a steam hammer. In the past 45 years we have been closing down those exceptions, on the principle that we can make every workplace safe by securing hazardous machinery behind guards and locks. A healthy dose of fear is still essential in some jobs, such as work at height or firefighting, but if you told a Health and Safety Executive inspector that your factory-floor risk assessment was based on workers scampering for cover whenever the crane moved, you would find yourself looking for a new job.

Question of ethics

If robots on any scale are to work among us without causing injury it would help if they could work out the implications of their actions. Step forward Professor Alan Winfield and his consequence engine. Winfield researches cognitive robotics at the University of the West of England's Robotics Lab and was keen to develop a machine that went beyond the current state of protection, in which a robot programmed to detect a human presence can simply halt its own movement where it could harm the human. This is what US robot ethicist James Moor defines as an "implicit ethical agent". Winfield believes he has gone one rung higher in Moor's hierarchy and created an "explicit ethical agent", one that can reason about ethics, if only to a limited extent.

His software, the consequence engine, allows a robot to model the possible consequences of actions in meeting a directive to physically intervene to prevent harm coming to a human. The robot can then choose the best course of action.

To test the engine, Winfield and his team set an NAO robot a challenge. (The 58cm-tall NAO may look like a cute toy but is a sophisticated machine with the capability to sense and react to its surroundings, available in the UK to academics only.) The NAO running the consequence engine -- named by the team the A-robot -- is set walking towards a hole in the ground. It skirts the hole successfully. Next a "human" -- another NAO, called the H-robot -- is introduced, also walking towards the hole. The A-robot has to choose how to meet its directives to keep itself safe while preventing the human from coming to harm.

In the experiment, the robot models the consequences of each of its options (stand still, go straight ahead, turn left, turn right) and, in almost all trials, it turns and collides with the human to prevent it from falling into the hole.
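
The principle can be sketched in a few lines of Python. The toy corridor, candidate actions and penalty weights below are illustrative assumptions, not Winfield's actual implementation: the robot simulates each option against an internal model of the world, scores the predicted outcome against its directives and picks the least harmful action.

```python
# Toy consequence-engine decision loop (illustrative assumptions throughout).
from dataclasses import dataclass

HOLE = 4  # cell index of the hole at the end of a five-cell corridor

@dataclass
class Outcome:
    human_in_hole: bool
    robot_in_hole: bool
    collided: bool

def simulate(action: str, horizon: int = 4) -> Outcome:
    """Predict the world `horizon` ticks ahead for one candidate action."""
    human, robot = 1, None          # the human starts at cell 1; robot off-path
    if action == "step_into_path":
        robot = 3                   # occupy the cell just before the hole
    elif action == "walk_to_hole":
        robot = HOLE                # the robot wanders into the hole itself
    collided = False
    for _ in range(horizon):
        next_cell = human + 1
        if next_cell == robot:      # the robot blocks (collides with) the human
            collided = True
            break
        human = next_cell
        if human == HOLE:           # the human has fallen in
            break
    return Outcome(human_in_hole=(human == HOLE),
                   robot_in_hole=(robot == HOLE),
                   collided=collided)

def score(o: Outcome) -> int:
    """Lower is better: harm to the human outweighs harm to the robot,
    which outweighs a gentle blocking nudge."""
    return 100 * o.human_in_hole + 10 * o.robot_in_hole + 1 * o.collided

def choose_action() -> str:
    """The consequence-engine step: simulate every option, pick the least bad."""
    actions = ["stand_still", "step_into_path", "walk_to_hole"]
    return min(actions, key=lambda a: score(simulate(a)))

print(choose_action())  # -> "step_into_path": block the human before the hole
```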

The programming, says Winfield, "appears to match remarkably well with Asimov's first law of robotics [see below] ... The robot will avoid injuring (ie colliding with) a human (may not injure a human), but may also sometimes compromise that rule in order to prevent a human from coming to harm (...or, through inaction, allow a human to come to harm)".

So what happened when the team tested its ethical judgment, with two humans heading towards the hole and time to save only one? "Out of 33 runs, 16 times the A-robot managed to rescue one of the H-robots, but not the other, and amazingly, three times the A-robot rescued both," Winfield writes. What he found most interesting was the 14 instances in which the A-robot failed to save either human. On the videos of the runs, he says, you can clearly see the A-robot notice one human and start towards it, then notice the other and stop. At this point, the robot, faced with an ethical dilemma (which to save?), loses valuable time dithering and saves no one.

"The dithering is fixable to an extent," says Winfield. "But if an ethical dilemma is impossible for a human, it will be impossible for a robot to resolve."

Winfield is also involved in another research programme led by Professor Michael Fisher, director of Liverpool University's Centre for Autonomous Systems. This work aims to de-risk the choices made possible by the consequence engine by analysing all the possible actions it could generate, using the verification procedures applied to safety-critical software systems such as autopilots.

"They run every path through the system's logic through a kind of 'prover'," Winfield explains. "We want to be able to prove the robot cannot make the wrong decision." It is early days, he says, but the results are promising.

Singular problem

You could argue that robots are the epitome of predictability, so once you've seen one move it's easy for a human colleague to keep out of harm's way. It's true that the control languages for today's industrial manipulators are very simple and reliable but, to interact closely with humans in our world, that programming has to become a lot more complicated, and bugs are inevitable.

The mechanics themselves can be unpredictable; unless it's operated with great care, a multi-jointed robot arm can end up in a "singularity" where two joints line up, and the controller loses track of which motor moves in which direction. If it decides to execute "in time=0 do rotate=360°", standing next to it may not be the best option. The robot is still not to blame, but who is?
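
The singularity can at least be detected before it bites. One common approach, sketched below for an assumed planar two-link arm (the link lengths and threshold are illustrative, not taken from any particular robot), is to monitor the determinant of the arm's Jacobian: when it collapses towards zero the joints have lined up, small tool-tip motions demand runaway joint speeds, and a motion supervisor should refuse the move.

```python
# Singularity check for an assumed planar two-link arm (illustrative values).
import numpy as np

L1, L2 = 0.4, 0.3               # link lengths in metres (assumed)
SINGULARITY_THRESHOLD = 0.01    # assumed minimum acceptable |det J|

def jacobian(theta1: float, theta2: float) -> np.ndarray:
    """Geometric Jacobian of a planar two-link (2R) arm: maps joint
    velocities to end-effector (x, y) velocity."""
    s1, c1 = np.sin(theta1), np.cos(theta1)
    s12, c12 = np.sin(theta1 + theta2), np.cos(theta1 + theta2)
    return np.array([
        [-L1 * s1 - L2 * s12, -L2 * s12],
        [ L1 * c1 + L2 * c12,  L2 * c12],
    ])

def near_singularity(theta1: float, theta2: float) -> bool:
    """det J = L1 * L2 * sin(theta2): it vanishes when the two links line up
    (theta2 near 0 or pi) and joint speeds blow up for finite tool speeds."""
    return abs(np.linalg.det(jacobian(theta1, theta2))) < SINGULARITY_THRESHOLD

print(near_singularity(0.3, 1.2))    # False: well-conditioned pose
print(near_singularity(0.3, 0.01))   # True: links aligned -- refuse the move
```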

If we are to take robots out of their cells, then before anyone clicks a mouse we need to decide whether that robot should be absolutely incapable of causing any harm, or whether we should accept responsibility for staying out of the way.

The result may be some combination of the two, but there must be a single top-level priority for directing the detailed programming.

You may be thinking "well, it's obvious: the robot must be safe", but it's not that simple. A Roomba vacuum cleaner is safe to be around because it's physically incapable of sucking your arm off, but a robot that's holding a car engine in position while a worker connects the wiring has to be strong enough to lift it, so it follows that it's strong enough to kill the worker.

Developers will often point to biochemist and author Isaac Asimov's laws of robotics, formulated in the 1940s, as the guiding light for their research. The first law is usually quoted as "A robot may not harm a human, or, by inaction, allow a human to come to harm".

That's a mis-phrasing of two ideas from different novels, one filtered by a translator. What Asimov actually said is: "Any modern robot … would weigh such matters quantitatively. Which of the two situations, A or non-A, creates more misery? The robot would take a quick decision and opt for the least misery."

The word misery suggests how stupendously complicated this "quick decision" is to turn into computer code (though there are people working towards it). We cannot teach a robot to have morals, nor can we teach it how to empathise. We can give it a set of scenarios and tell it what to do, but, placed in the chaotic world we humans live and work in, no list of examples can ever be exhaustive.

If two pedestrians run out in front of a robot car and it is unable to stop, should it kill the mother or the child? Which death leads to less misery? A human driver would make a snap decision; a programmer cannot. Should a robot assistant on a production line prioritise the safety of visitors over employees? Managers over interns? Women over men? Who would dare to say?

Second guessing

The solution to date is to make the robot so over-cautious that it can always stop in time, but that leads to dangers too. In a set of nine accident reports released by the California Department of Motor Vehicles, covering driverless cars tested on the state's roads, the common theme was that other drivers expected the vehicle in front to behave as they themselves would, with the same disregard for caution. Faced with a vehicle that drove slowly enough to be safe, they drove into the back of it.

If robots are to fit into our world, they may need to take some risks to keep up with us, and that places huge liabilities on manufacturers and programmers. Volvo recently announced that it wanted to take full responsibility for every accident its driverless cars were involved in, but the legal systems of the world don't work like that.

Giving suppliers total liability for everything a robot does would add several zeros to its price tag. Who would be to blame when something went wrong? The programmer who fed in the list of examples, or the operator who taught it the sequence of movements? Should a co-worker be allowed to tweak his metallic assistant's moral compass, and would you buy a car that would kill you to save a stranger?

External threats

We also have to accept that hacking is an increasing threat. We've seen attacks that alter process control software to cause physical damage in factories, and robots will inevitably be targeted by people wanting to snoop through their on-board cameras, or simply to cause chaos. Your mechanical co-worker might well have been factory-programmed not to kill you, but where exactly did that "important software update" come from? Cybersecurity is nowhere near strong enough.

If we can't make a robot intrinsically safe, can we make it less able to do major damage and still be useful? In many of the deaths involving industrial robots a worker was crushed between the manipulator and the surrounding structure.

One approach is to cover the robot in soft padding so if someone did get in the way they could be gently pushed clear. This works well where the robot is moving very slowly in an open area, but for most workstations it's not possible -- the robot will be picking up things and putting them down, so there will be structures within the working envelope against which someone could be trapped. We can cover the area with sensors that detect where humans are, but even with redundancy we're not immune to software bugs. Those cages still sound like a sensible control measure.
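
To illustrate the kind of redundancy meant here (a hypothetical example, not something described in the article), presence-detection channels are often combined by voting, so that a single faulty sensor neither trips spurious stops nor, more importantly, hides a person standing inside the envelope:

```python
# Two-out-of-three vote across independent presence-sensing channels
# (illustrative; real safety functions use certified hardware and logic).
def human_detected(channel_readings: list[bool]) -> bool:
    """Trip the safety stop if at least two of the three channels agree."""
    return sum(channel_readings) >= 2

# One channel has failed 'stuck at False', but the vote still trips the stop:
print(human_detected([True, False, True]))   # True -> command a protective stop
# One channel has failed 'stuck at True'; no spurious stop is triggered:
print(human_detected([True, False, False]))  # False -> keep running
```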

Baxter on hand

The next option is to build in compliance -- adding flexibility to the mechanics so if the robot hits an obstruction the force is applied gradually. Combined with sensors to measure that force, the robot should stop before it injures anyone.
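
A minimal sketch of that stop logic is shown below. The force threshold, loop rate and robot interface are assumptions for illustration, not any vendor's actual control code: a supervisory loop reads the estimated contact force each cycle and commands a protective stop before the limit is exceeded.

```python
# Hypothetical force-limited stop supervisor (all values assumed).
import time

FORCE_LIMIT_N = 80.0       # assumed threshold; real limits depend on body region
CONTROL_PERIOD_S = 0.002   # assumed 500 Hz supervisory loop

class ForceGuard:
    def __init__(self, read_force_n, stop_motion):
        # read_force_n: callable returning the current contact-force estimate (N)
        # stop_motion:  callable commanding an immediate protective stop
        self.read_force_n = read_force_n
        self.stop_motion = stop_motion

    def tick(self) -> bool:
        """One supervisory cycle; returns True if a protective stop was issued."""
        if self.read_force_n() > FORCE_LIMIT_N:
            self.stop_motion()
            return True
        return False

    def run(self) -> None:
        while not self.tick():
            time.sleep(CONTROL_PERIOD_S)

# Example wiring, with dummy callables standing in for the real robot interface:
forces = iter([5.0, 12.0, 95.0])
guard = ForceGuard(read_force_n=lambda: next(forces),
                   stop_motion=lambda: print("protective stop issued"))
guard.run()   # prints "protective stop issued" on the third reading
```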

This is the system used by Baxter, an assistance robot built by Rethink Robotics and designed to operate in proximity to humans on assembly lines and in education. (Baxter is priced at around $25,000, about £19,000, roughly the average annual salary of a US production worker.) It also uses a multitude of sensors to detect objects before it touches them, and it is as safe as a robot can get these days.

The drawback is that it must still move frustratingly slowly, and the compliance places severe limits on load capacity. Robots of this type are ideal for repetitive tasks, such as loading and sorting materials, but can't compete with a human for speed and strength. On a commercial level, you are often better off with a conventional robot in a cage.

We are left with the idea that robots capable of doing what we can do will always be dangerous and, if we intend to let them out of their cages, we humans should just learn to be careful. In an engineering sense it's the simplest option, but how does it fit with the ALARP (as low as reasonably practicable) risk management model? Just how much risk is acceptable, and who decides?

It's a question all of society should be concerned with, because these robots are almost here. But perhaps history will repeat itself. Perhaps workers will feel threatened by machines that replace their friends, no matter what they look like.
