
Why It’s Hard To Learn From Near Misses

by Anthony Jones

HSEQ Safety VUCA

The COVID-19 pandemic as a clear example


Near misses happen all the time, yet learning from them seems remarkably hard, at both a personal and an organisational level. The way the human brain works may partly explain this.

If we look at the COVID-19 crisis, there are clear examples of countries, such as the United States and the United Kingdom, where the experience of a near miss was ignored. They saw what happened in South Korea, for example, and still chose to act the way they did.

What might partially explain this is the outcome of the near miss South Korea experienced: quick containment and very few deaths. This is down to a cognitive bias known as the outcome bias. The classic aviation example illustrates it well: if two planes nearly collide but luckily miss each other, the pilots may believe their skill saved the day, and no major investigation is launched to sort out the systemic errors that caused the near miss.

As the outcome of a near miss is not having an accident, our bias is to assess the actions that led to it as better than if an accident had happened.

The outcome bias was first described by Jonathan Baron and John Hershey in 1988. The volunteers in their study were asked to rate other people's decision-making and reasoning in uncertain situations, such as gambling. They gave better scores when the outcomes were favourable, even though chance played a large role in those outcomes. (1)

“As the outcome of a near miss is not having an accident, our bias is to assess the actions that led to it as better than if an accident had happened.”

There is another bias that affects our view of near misses: the optimism bias.

The first study that revealed the optimism bias was conducted by psychologist Neil Weinstein in 1980. He found that when it comes to people’s views of their future prospects, they are “unrealistically optimistic”. (2)

In the study, Weinstein asked over 200 students to rate their chances of experiencing positive or negative life events. The students then rated the chances of other people in the group experiencing the same events. Not surprisingly, most of the students thought their prospects were better than average. Similar studies have been done on how intelligent people believe themselves to be; most people think they are above average.

Conclusion

It is easy to imagine how the optimism bias could affect our view of near misses, and of course our view of COVID-19, too. With the outcome bias thrown into the mix, it is even easier to understand why near misses are rarely taken seriously enough. The bottom line is that organisations should understand how people work and take that into account when analysing and acting on the near misses they experience.

 

Plan Brothers provides Near Miss Reporting and Prevention Tools. If you would like to learn more, book a demo with us or try our 30-day free trial.

 

References

  1. Baron, J., & Hershey, J. C. (1988). Outcome bias in decision evaluation. Journal of Personality and Social Psychology, 54(4), 569–579.
  2. Weinstein, N. D. (1980). Unrealistic optimism about future life events. Journal of Personality and Social Psychology, 39(5), 806–820.
