Mental Traps That Block Learning from Incidents
Incidents happen — and afterwards, it usually seems clear what went wrong. Or does it?
In reality, it’s often not that simple. When we try to learn from incidents, invisible obstacles stand in our way: our own thinking errors.
Heuristics and Biases
Mental shortcuts and distorted perceptions help us make quick decisions in everyday life. But when it comes to learning from incidents, they often lead us down the wrong path.
Four thinking traps are particularly important to watch out for:
Fundamental Attribution Error: “They were just careless”
After an incident, our first instinct is often to blame the individuals directly involved.
“They should have paid more attention!”
What we tend to overlook is that decisions are never made in isolation. Time pressure, poor communication, and unsafe working conditions — all of these factors influence behavior.
If we focus only on individual mistakes, we miss the chance to fix the system itself. And that’s exactly what sustainable safety depends on.
False Uniqueness Bias: “That could never happen to us”
When incidents happen elsewhere, many organizations quickly think: “Well, they made mistakes — but we’re better than that.”
This is the False Uniqueness Bias: the belief that we’re smarter, better prepared, or less vulnerable than others.
But that mindset blocks learning.
Instead of benefiting from others’ experiences, we lull ourselves into a false sense of security.
The real question should be: “Where might we be just as vulnerable?”
Confirmation Bias: “It’s obviously human error”
We love being right. That’s why we tend to seek out information that confirms our initial assumptions — and ignore anything that challenges them.
In incident investigations, this often plays out when the idea of “human error” is introduced early on. From that point, every fact tends to be interpreted through that lens.
The problem? If we don’t seriously consider evidence that doesn’t fit our assumptions, we miss critical insights.
Real learning requires openness — and the courage to question our own thinking.
Hindsight Bias: “Everyone should have seen that coming!”
After an incident, everything suddenly seems obvious: “Why didn’t anyone realize this would happen?”
That’s hindsight bias at work. Looking back, everything appears clear and predictable.
But in the real moment, things are often complex, confusing, and full of uncertainty.
Sidney Dekker’s famous tunnel example illustrates this perfectly:
After a fatal tunnel collapse, it seemed obvious that no one should have been working there.
But for those involved at the time, under pressure and without clear warning signs, the decision to continue seemed completely reasonable.
If we ignore this reality, we lose valuable learning opportunities — the chance to understand why capable people sometimes make decisions that lead to bad outcomes in complex systems.
Conclusion
Heuristics and biases are part of being human — we all fall into these traps.
The key is to recognize them and actively push back:
Instead of looking for someone to blame: Understand the context.
Instead of thinking “that wouldn’t happen to us”: Honestly assess our own vulnerabilities.
Instead of sticking to first assumptions: Stay open to complex explanations.
Instead of playing the critic with hindsight: Ask how the situation actually appeared at the time.
Only then can we turn incidents into real opportunities to learn and improve.
Thanks to Randy Laybourne on Unsplash for the original photo.