Why “human error” is a symptom, not a root cause

If 80% of your investigation reports (safety or quality) attribute the incident to “human error”, you have not identified a root cause. You have identified a scapegoat. Across many industries, investigation reports still default to the same conclusion after an incident: “the operator failed to follow the procedure”. The corrective action that follows is almost always the same: “retrain the operator”. The report is signed off. The file is closed. Management feels reassured that the problem has been addressed.

In reality, nothing of substance has been fixed.

The Executive Blind Spot – When discipline is mistaken for safety awareness

Many senior leaders still view safety failures primarily as discipline failures. The logic is simple and appealing. The process was sound. The documentation was approved. The training records were complete. Therefore, the only remaining variable must be the individual. This belief is comforting, but it is deeply flawed.

As W. Edwards Deming famously observed, the overwhelming majority of defects and failures arise from the system, not the people working within it. Blaming individuals distracts leadership from their true responsibility, which is to design, resource, and govern systems that reliably produce the desired outcome.

When executives accept “human error” at face value, they are not exercising oversight. They are outsourcing accountability.

The Operational Reality

You are confusing symptoms with causes

“Human error” is almost never a root cause. It is an outcome.

If a task can be carried out in a way that creates a foreseeable immediate or latent risk to health and safety, the failure sits with the system, not the individual:

  • If equipment can be assembled incorrectly and expose someone to moving parts, electrical energy, hazardous substances, or loss of containment, the design is at fault.
  • If a safety-critical step can be missed under normal workload or time pressure, the safe system of work is inadequate.
  • If a control relies solely on a person remembering to “be careful”, the risk assessment has failed to recognise predictable human fallibility.

From a health and safety perspective, these are not behavioural issues. They are failures of design, planning, and risk control. A competent H&S management system assumes that people will make mistakes and puts physical, procedural, and organisational controls in place so that those mistakes do not result in injury, ill health, or major incidents.

This distinction is well established in human factors engineering and accident investigation. James Reason described this clearly through his Swiss Cheese Model, which shows how failures emerge when multiple system weaknesses align, rather than because one individual makes a mistake.

Blaming the person allows weak design, ambiguous instructions, poor ergonomics, and inadequate tooling to escape scrutiny. The outcome is entirely predictable. The same failure reappears later, carried out by a different operator, under the same conditions.

The Retraining Paradox

Why retraining is an expensive illusion

Retraining is the most overused and least effective corrective action in safety management. This is not because training has no value, but because it is routinely used as a substitute for proper system design.

The cognitive limit

You cannot train a human being to maintain perfect vigilance for eight hours a day in a repetitive task. Cognitive science is clear on this point. Attention naturally fluctuates. Fatigue, distraction, and workload are unavoidable biological realities. Expecting flawless performance through concentration alone is unrealistic. When organisations demand “more care” or “greater attention”, they are not managing safety. They are ignoring human limitations.

The illusion of action

Retraining is attractive to management because it appears decisive while requiring minimal investment. It avoids spending money on better tooling, automation, environmental improvements, or error-proofing measures such as Poka-Yoke, where physical or design controls are built into the process to prevent incorrect actions or to stop work before an unsafe condition can arise.

It also subtly reinforces a blame culture. Workers learn that admitting mistakes leads to scrutiny, not system improvement. Near misses go unreported. Weak processes remain hidden. This is not continuous improvement. It is organisational self-deception.
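The Poka-Yoke principle mentioned above can be made concrete with a toy sketch. This is an illustrative model only (the machine, class, and method names are hypothetical, not from any real control system): the interlock blocks the unsafe action itself, so safety never depends on the operator remembering to be careful.

```python
class InterlockError(Exception):
    """Raised when a machine cycle is attempted in an unsafe state."""


class GuardedPress:
    """Hypothetical press with a guard interlock: a poka-yoke control.

    The design, not operator vigilance, prevents the unsafe cycle.
    """

    def __init__(self):
        self.guard_closed = False

    def close_guard(self):
        self.guard_closed = True

    def open_guard(self):
        self.guard_closed = False

    def cycle(self):
        # The control is built into the process: with the guard open,
        # the cycle physically cannot start. No reliance on memory,
        # attention, or "more care".
        if not self.guard_closed:
            raise InterlockError("Cycle blocked: guard is open")
        return "cycle complete"
```

The point of the sketch is the design choice: the error path is unreachable by construction, which is exactly what retraining alone can never guarantee.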

What the ISO Standards Actually Expect

Systems that tolerate human fallibility

A mature safety management system does not assume perfect humans. It assumes the opposite. International standards such as ISO 45001 and ISO 9001 emphasise process control, risk-based thinking, and the elimination of foreseeable failure modes. They implicitly recognise that people make mistakes and that systems must be designed to prevent those mistakes from becoming defects, incidents, or injuries.

You cannot fix a system problem by focusing only on behaviour. If a process depends on people never forgetting, never rushing, and never being distracted, it is not a robust process in the first place.

A challenge for Directors and Senior Leaders

  • Audit your last two years’ accident investigation reports.
  • Count how many list “human error” as the root cause.
  • If the figure is above 20 percent, reject them. Send them back.

Instruct your quality and EHS teams to remove the phrase entirely and answer a more useful question instead, such as: “Which unclear instruction, poor layout, inadequate lighting, confusing interface, unrealistic workload, or flawed tool design allowed this error to occur?”
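The audit above is simple arithmetic: count the reports citing “human error”, compare the fraction against the 20 percent challenge threshold. A minimal sketch in Python (the report structure and field name are hypothetical, chosen for illustration):

```python
def human_error_rate(reports):
    """Fraction of investigation reports whose recorded root cause
    mentions 'human error' (case-insensitive)."""
    if not reports:
        return 0.0
    flagged = sum(
        1 for report in reports
        if "human error" in report["root_cause"].lower()
    )
    return flagged / len(reports)


def needs_rework(reports, threshold=0.20):
    """True when the rate exceeds the 20 percent challenge threshold,
    meaning the reports should be rejected and sent back for deeper
    system-level analysis."""
    return human_error_rate(reports) > threshold
```

Run against two years of closed reports, anything flagged by `needs_rework` becomes the backlog for the “which unclear instruction, poor layout, or flawed tool design allowed this?” question.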

This single change shifts the organisation from blame to learning, and from comfort to integrity.

Final Thought

Systems over scapegoats

Blaming individuals feels decisive, but it achieves nothing. Fixing systems is harder, more uncomfortable, and more expensive, but it is the only route to sustained improvement.

The choice is simple.

Are you fixing the system, or are you blaming the victim?

If your system only works when people never make mistakes, it does not work.
