The psychology of human error - why do people get things wrong?

Opinion | Claire Philp, Human Applications | 04 September 2017

Every day news headlines implicate ‘human error’ in every kind of negative event imaginable – from environmental disasters to data breaches, plane and train crashes to missile failures. The overexposure of this term has drawn us into a sort of learned helplessness.

How many of us have found that a conversation with an operator about improving safety is dismissed with some variation of ‘you can’t prevent every accident, people make mistakes’?

Following a disaster, society demands answers. Once we have ruled out mechanical and technical failure, or acts of sabotage, we tend to be left with ‘human error’. And that’s generally where it stops, in the media at least. But what does it actually mean and why does it matter?

‘Human error’ as a phrase satisfies our societal demand for somebody to blame. It is an emotive phrase that all too often serves as a sticking plaster over much more complex, deep-seated and altogether boring issues that the man on the Clapham omnibus is not concerned with. It provides the public with a conclusion, an end point to a tragic event.

We still see large companies that believe that if they could just get rid of the 5% of the workforce who are too stupid or too lazy to follow instructions, then errors and accidents would disappear.

But the reality is that those boring underlying issues won’t go away beneath our ‘human error’ sticking plaster. Much like an infection, if we don’t accurately identify the type of error we are dealing with and the factors that contributed, we risk treating it the wrong way – or neglecting to treat it at all – and that can only mean deterioration and recurrence.

And where does the mundane fit in? We commit errors every day, but for most of us the outcome is not a large-scale disaster. In fact, we commit errors much more often than we realise, because the vast majority don’t have any measurable effect at all.

If we never revisit those occasions when we messed up but everything was fine, we don’t learn. Even worse, we develop a false sense of invincibility.

What is human error?

In the aftermath of an incident it is often evident that the frontline operator did not perform to procedure in the period leading up to the event. Our predisposition for hindsight bias makes the link between this breach of procedure and the event seem obvious and tantalisingly simple, but such observations do not tell the whole story.

If human error simply describes not doing what the system dictates, we lump miscalculations in the same box as determined sabotage. It is clear that these distinct situations should not receive the same response – so why do we stop with the umbrella term?

What if the system was wrong in the first place? What if the system didn’t even have a rule for the situation? Is the way decision making is classified entirely dependent upon whether the outcome was favourable or not? Either way the operative did not follow procedure.

Take Chesley Sullenberger, for example: had the tip of the wing skimmed the Hudson River and broken the fuselage apart, would we still have correctly identified his actions as heroic, or would it be just another of those ‘human error’ newspaper headlines?

The situation and the judgement are exactly the same in both realities, yet what we recognise as incredible skill, calculation and heroism in one reality is dismissed as ‘human error’ in the next.

Establishing how and why the operator’s behaviour deviated from the system’s prescription can diagnose diverse vulnerabilities in the system. There are distinct categories of error, each highlighting very different issues and calling for very different responses.
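
To make that distinction concrete, here is a minimal sketch in Python of how an investigation might route an observed deviation to a category, and a category to a response. It borrows James Reason’s widely used taxonomy of slips, lapses, mistakes and violations – the article itself does not name a taxonomy – and the decision tree and response mappings are illustrative assumptions, not a prescribed method.

from enum import Enum, auto

class ErrorType(Enum):
    SLIP = auto()       # right intention, wrong execution (e.g. pressed the wrong button)
    LAPSE = auto()      # right intention, a step forgotten (memory failure)
    MISTAKE = auto()    # the plan itself was wrong (wrong rule or incomplete knowledge)
    VIOLATION = auto()  # the procedure was deliberately not followed

def classify(intended_to_follow_procedure: bool,
             plan_was_adequate: bool,
             action_matched_plan: bool,
             step_omitted: bool) -> ErrorType:
    """Route an observed deviation to an error category.

    A deliberately crude decision tree: real classification needs context
    (task design, workload, competence) that no four booleans can capture.
    """
    if not intended_to_follow_procedure:
        return ErrorType.VIOLATION
    if not plan_was_adequate:
        return ErrorType.MISTAKE
    if step_omitted:
        return ErrorType.LAPSE
    if not action_matched_plan:
        return ErrorType.SLIP
    raise ValueError("No deviation to classify")

# Different categories call for very different responses (illustrative only).
RESPONSES = {
    ErrorType.SLIP: "redesign controls and displays so the right action is the easy one",
    ErrorType.LAPSE: "add checklists, reminders or forcing functions",
    ErrorType.MISTAKE: "improve training, procedures and decision support",
    ErrorType.VIOLATION: "ask why the rule is broken - goals, pressure, culture",
}

if __name__ == "__main__":
    error = classify(intended_to_follow_procedure=True,
                     plan_was_adequate=True,
                     action_matched_plan=True,
                     step_omitted=True)
    print(error.name, "->", RESPONSES[error])

No four booleans can capture the context that real classification demands; the point is only that each category leads the investigation somewhere different.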

Why do we care?

When faced with the task of investigation we naturally seek out simple, satisfying answers - and in a world of interwoven systems these can be hard to come by. When we condense our findings into a widely misunderstood two-word phrase we risk missing out on many of the lessons that could prevent history from repeating itself.

As practitioners we get it. We endeavour to ensure that ‘human error’ is not the concluding statement – what kind of error are we dealing with? How will we respond to it in the short term? And how will we fortify our system to resist such errors in the future? But do our clients get it?

Those of us in industry have to ask ourselves how confident we are in the systems we have in place to identify and deal with errors correctly. How about the competence of our managers to investigate these events without falling foul of their inbuilt biases?

After all, we’re only human.

Find out more in an interactive session exploring error classification at the National Safety & Health Conference on 14 September at Nottingham Belfry.

Thanks for reading Connect, and if you have any stories to tell or opinions to share, please email connect@iosh.com.
