
What are near misses?

The disputed territory between lagging and leading indicators.

It seems that everyone wants to claim near misses for their cause. That should be good news – near misses present valuable opportunities to learn and improve. Unfortunately, sensible discussion of near misses all too often degenerates into a debate about whether they are lagging or leading indicators of safety. Turn that around, though, and the debate itself may be a clue to debunking some of the fallacious arguments about lagging and leading indicators. Let’s explore.

Near What?

As a starting point: are we talking about a near MISS or a near HIT?

  • It was a miss, but it was a close escape – a near miss, with “near” describing the miss.
  • It nearly hit – a near hit, with “near” meaning almost.

The choice is yours. “Near miss” is conventional and well understood, which is why I choose to use it. However, having read Daniel Kahneman’s book Thinking, Fast and Slow, it occurs to me that psychologically (considering cognitive ease) “near miss” may sound too good, too comfortable, too safe. “Near hit” is a less common phrase, and “hit” is more likely to trigger a mental reaction than “miss”. If so, it may draw greater attention and reaction to the same event.

Luckily, nothing happened - this time

Some say that in a near miss nothing actually happened. They argue that a near miss provides a glimpse into the future – a suggestion of something more serious that might happen on another occasion. The message is that, correctly understood, a near miss is an opportunity to learn. Apply that knowledge to take action to prevent possibly more serious consequences another time. Using this argument, near misses are taken as leading indicators that can be used to help create safety.

But it was an incident

“Near misses describe incidents where no property was damaged and no personal injury sustained, but where, given a slight shift in time or position, damage and/or injury easily could have occurred” (U.S. OSHA definition). The clear message is that, despite no physical harm, something undesirable happened. On this basis a near miss is a lagging indicator.

Is a near miss an unsafe condition?

We can make a distinction between a "near miss" and an "unsafe condition". An unsafe condition can exist even when there is no incident – making it a leading indicator. Examples could include corrosion of steel walkways, uninspected pressure vessels, defective brakes, PPE not worn, or poor electrical grounding.

Too late?

Classing near misses as a lagging indicator does not necessarily mean it is too late. True, you cannot go back and prevent that particular incident. But as with all incidents, up to and including fatalities, it is still possible, if not an obligation, to investigate, learn from the experience and take remedial action to prevent a recurrence. In a sense, the lagging indicator generated by incidents becomes a leading indicator for prevention.

Neither and both

Near misses are, quite simply, indicators. They straddle the descriptions of lagging and leading. They represent something that was unsafe (but you were lucky); they are weak signals that provide evidence in advance of the possibility for injury or damage. What matters is that near misses can be a relatively plentiful and rich source of data for learning and improvement.

We should stop worrying about the terms lagging and leading and wisely use whatever data we can to predict and improve. We miss opportunities for improvement if we become too dogmatic and say that lagging = too late, or leading = too subjective.

PDCA / PDSA

Continual improvement in safety can be achieved by using PDCA or, better, PDSA:

PDCA (Plan-Do-Check-Act), also known as the Deming Cycle, is used extensively for continual improvement and has been adopted as the basis for management system standards such as OHSAS 18001.

PDSA (Plan-Do-Study-Act), also known as the Shewhart Cycle, is the version Deming taught (for more details of PDSA see Out of the Crisis and The New Economics). He advocated “Study” because it implies understanding why something improved, or did not, whereas “Check” suggests merely confirming whether or not all is going to plan.

PDSA can be used for continual improvement:

  • to help create what we predict will be a safe work environment (safety precursors);
  • as the basis for an operational definition of safety.
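
To make the cycle concrete, here is a minimal sketch of a PDSA loop over safety indicators, written in Python purely for illustration. The function names, targets and indicator values are hypothetical assumptions, not something prescribed by Deming or by any standard.

```python
# A minimal, illustrative PDSA loop for safety indicators.
# All names, targets and values here are hypothetical.

def plan():
    # Plan: decide what safe working requires and set indicator targets.
    return {"inspection_completion_target": 0.95, "open_action_limit": 10}

def do(the_plan):
    # Do: carry out the work and collect indicator data (dummy values here).
    return {"inspection_completion": 0.88, "open_actions": 14, "near_miss_reports": 6}

def study(the_plan, results):
    # Study: understand WHY results differ from the plan, not just whether they do.
    findings = []
    if results["inspection_completion"] < the_plan["inspection_completion_target"]:
        findings.append("inspection completion below target - find out why")
    if results["open_actions"] > the_plan["open_action_limit"]:
        findings.append("too many open actions - review resourcing and priorities")
    return findings

def act(findings):
    # Act: adjust methods (or the plan itself) based on what was learned.
    for finding in findings:
        print("Action:", finding)

# One pass shown here; in practice the cycle repeats continually.
p = plan()
act(study(p, do(p)))
```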

Creating safety precursors

Take time to think what is required for safe working. Decide how you will achieve it - consistently. Implement those plans. In a previous article I discussed a practical approach for effective improvement to the control of safety risks. Using PDSA can build on that.

Monitor what you do. How well are you doing on what you say you will do? Reassurance could come, for example, through observations from inspections, timely completion of actions and the reporting of near misses. Call these leading indicators if you like. Given the importance of learning from near misses, action taken to encourage people to report near misses is also an important leading indicator of safety. Note that an increase may represent greater transparency and trust, not a worsening situation. Near misses should be prevented but, whatever the incident was, people should be congratulated for REPORTING them.

Complete a PDSA cycle by taking action to rectify any shortcomings in your methods of achieving safety (see Figure 1). Keep repeating the process. The message is that leading indicators help you predict and improve safety.


Figure 1: PDSA on the work process with examples of leading indicators
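
As an illustration of that monitoring step, the sketch below computes a few example leading indicators from hypothetical monthly records: inspection completion, timely closure of actions, and the number of near-miss reports received. The record structure and field names are assumptions made for the example only, not a prescribed format.

```python
# Hypothetical monthly monitoring records; the field names are assumptions.
inspections = [{"planned": 40, "completed": 37}]
actions = [{"due": 25, "closed_on_time": 21}]
near_miss_reports = 12  # count of near-miss reports received this month

def rate(done, planned):
    # Simple completion rate, guarding against division by zero.
    return done / planned if planned else 0.0

inspection_completion = rate(sum(i["completed"] for i in inspections),
                             sum(i["planned"] for i in inspections))
action_closure = rate(sum(a["closed_on_time"] for a in actions),
                      sum(a["due"] for a in actions))

print(f"Inspection completion: {inspection_completion:.0%}")
print(f"Actions closed on time: {action_closure:.0%}")
print(f"Near-miss reports this month: {near_miss_reports}")
# Remember: a rise in near-miss reports may reflect greater transparency
# and trust, not a worsening situation.
```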

How effective are the safety efforts?

Wisely chosen indicators will allow you to improve safety performance. But recognise that leading indicators are a heuristic – they substitute a simpler question, “are we doing what we think are the right things?”, for the difficult question “are we safe?”

Even if your indicators are well chosen and showing excellent performance, you may have missed one crucial failure path that could lead to an accident. This is where near misses, as well as injuries and damage, provide valuable information about weaknesses in the safety system.

So, to judge success in achieving safety in any period of time, it is not sufficient to rely solely on leading indicators. However good you may feel about the effort you have put into creating a safe workplace, the acid test is this: did anyone get injured? The final part of the jigsaw is to obtain data on injuries and occupational illnesses, for example TRIR (Total Recordable Incident Rate), or the rates for LTI (Lost Time Injuries) and DART (Days Away, Restricted, or Transferred).
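
For reference, TRIR and the DART rate are conventionally normalised per 200,000 hours worked (roughly 100 full-time employees for a year). The sketch below shows that calculation; the annual figures are invented for illustration only.

```python
# OSHA-style normalisation base: roughly 100 full-time workers for one year.
HOURS_BASE = 200_000

def incident_rate(cases, hours_worked, base=HOURS_BASE):
    # Normalised rate: cases per `base` hours worked.
    return cases * base / hours_worked

# Hypothetical annual figures, for illustration only.
hours_worked = 450_000
recordable_incidents = 4   # recordable injuries and illnesses
dart_cases = 2             # cases with days away, restriction or transfer

trir = incident_rate(recordable_incidents, hours_worked)
dart_rate = incident_rate(dart_cases, hours_worked)
print(f"TRIR: {trir:.2f}   DART rate: {dart_rate:.2f}")
```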

Figure 2 shows how a TRIR might be presented graphically, with 3-sigma process behaviour limits calculated for performance that appears to have stabilised at about 1.0, with annual variation predicted to lie between 0.5 and 1.5.


Figure 2: Example of TRIR to report safety performance
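
One common way to calculate such limits is an XmR (individuals and moving range) process behaviour chart, in which the approximately 3-sigma limits are the mean plus or minus 2.66 times the average moving range. The sketch below applies that method to an invented series of annual TRIR values; it is not a reconstruction of the data behind Figure 2.

```python
# XmR-style process behaviour limits for annual TRIR values.
# The series below is invented for illustration, not the data in Figure 2.
trir_by_year = [1.1, 0.8, 1.2, 0.9, 1.0, 1.1, 0.9]

mean_trir = sum(trir_by_year) / len(trir_by_year)
moving_ranges = [abs(b - a) for a, b in zip(trir_by_year, trir_by_year[1:])]
avg_moving_range = sum(moving_ranges) / len(moving_ranges)

upper_limit = mean_trir + 2.66 * avg_moving_range           # ~3-sigma upper limit
lower_limit = max(0.0, mean_trir - 2.66 * avg_moving_range)  # a rate cannot go below zero

print(f"Centre line: {mean_trir:.2f}")
print(f"Process behaviour limits: {lower_limit:.2f} to {upper_limit:.2f}")
```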

Call these lagging indicators if you like, but do not stop there. Crucially, you must study the data to understand what it means. As Nassim Nicholas Taleb says in The Black Swan, “You can be very confident about what is wrong, not about what you believe is right”.

The data may be necessary for corporate reporting or other reasons, but comparing results against last year or some industry standard provides little or no knowledge. If you want to learn and improve you must study the details – specifically, your details.

Was it just luck?

If you have had no injuries, or a low rate, why was this? If you have had injuries you need to know the specifics, such as the type and severity of injuries, the type of work, the conditions and location where the injury occurred.

How much do you trust the reporting / recording process? Are thorough incident investigations identifying the cause of incidents (not who to blame)?

Is there good reporting on near misses to give a wider perspective of what is happening? Is that distorted by a disproportionate volume of trivial near miss reports to achieve a target or reward? Does fear restrict reporting on serious near misses?

The important point is that knowledge gained from studying lagging performance indicators must be fed back to improve your leading indicators and ensure that you are controlling actual safety risks. For example, if your incidents involve many soft tissue injuries you may need more focus on ergonomics or material handling. Safety checklists should be amended if necessary and perhaps new leading indicators established to track performance in these areas.
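
As a simple illustration of that feedback loop, the sketch below tallies hypothetical incident records by injury type so the dominant categories can be targeted with revised checklists or new leading indicators. The record fields and values are assumptions made for the example.

```python
from collections import Counter

# Hypothetical incident records; the field names and values are assumptions.
incidents = [
    {"injury_type": "soft tissue", "activity": "manual handling"},
    {"injury_type": "soft tissue", "activity": "material handling"},
    {"injury_type": "laceration", "activity": "maintenance"},
    {"injury_type": "soft tissue", "activity": "manual handling"},
]

# Count incidents by injury type, most frequent first.
by_type = Counter(record["injury_type"] for record in incidents)
for injury_type, count in by_type.most_common():
    print(f"{injury_type}: {count}")

# If soft tissue injuries dominate, that suggests more focus on ergonomics
# and material handling - and perhaps a new leading indicator to track it.
```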

An operational definition of safety

The requirements for an operational definition of safety are:

  • What is the organization's aim with respect to safety – the ideal?
  • What will the organization do to create improvements in safety – the methodology?
  • How will you know if you have achieved improved safety – the judgement?

The argument above suggests how leading and lagging indicators fit together in an integrated way as part of an operational definition of safety – a balanced approach, if you will.

The aim of continual improvement is implemented through preventive action, tracked by leading indicators that optimise performance of the safety system. Accidents provide data from which judgements can be made about the effectiveness of the preventive action. The knowledge gained from studying accidents is the basis for action to improve the preventive measures and the leading indicators. The cycle is shown diagrammatically in Figure 3.


Figure 3: PDSA as an operational definition of safety

Preoccupation with Failure

Fortunately, there appears to be widespread agreement about how learning from near misses can help with safety improvement. However, we should not let our enthusiasm about the benefits of near miss reporting and investigations overshadow the point that each near miss represents A FAILURE.

In their book, Managing the Unexpected, Karl Weick and Kathleen Sutcliffe identify a preoccupation with failure as a feature of so-called High Reliability Organizations. They stress that a near miss should be interpreted as a sign that your system’s safeguards are vulnerable. Any lapse is a symptom that something may be wrong with the system: something that could have severe consequences. Their view can be compared to the common cause hypothesis that, for an incident in any one organisation, the same causal path may lead to no injury, a minor injury or a major injury.  We should, as Weick & Sutcliffe suggest, "Interpret a near miss as danger in the guise of safety rather than safety in the guise of danger”.

Beware of people who talk about rear view mirrors

Some people love analogies. An analogy can be a useful way to explain a new concept, but remember that analogies can be fallacies. Just because there is one similarity does not mean that there are similarities in all respects. Less excusably, analogies are sometimes used to amuse an audience and divert attention from rational assessment.

“Using lagging indicators is like driving a car by looking in the rear view mirror”. Ha, ha!

What is meant, of course, but conveniently left unsaid, is ONLY looking in the rear view mirror. Driving a car and glancing from time to time in the rear view mirror warns you about cars approaching fast and about to overtake you. That improves safety.

Summary:  “Don’t throw the baby out with the bathwater”

I hope this article has shown that we learn something from both the PROCESS of creating safety (via leading indicators) and the safety RESULT (via lagging indicators).

Leading indicators can provide valuable day to day information on how PRECISE we are in doing what we say we should do. However, we should also continually revise our judgment of the effectiveness of our safety efforts by the successes and failures we experience. Lagging indicators can provide essential feedback on how ACCURATE we have been in identifying the necessary safety measures required. You just need to make sure you understand your data and not torture it to make it confess to suit some other motive.

Finally, we should learn from near misses and take action to improve. They are a performance indicator; talk of leading or lagging is a distraction from the fact that each near miss represents a failure…. and an opportunity.

References

Deming, W.E., 1986, Out of the Crisis. Cambridge, MA: MIT

Deming, W.E., 1993, The New Economics. Cambridge, MA: MIT

Kahneman, D., 2011, Thinking, Fast and Slow. London: Penguin Books

Taleb, N.N., 2007, The Black Swan: The Impact of the Highly Improbable. London: Penguin Books

Weick K.E. & Sutcliffe, K.M., 2007, Managing the Unexpected: Resilient Performance in an Age of Uncertainty, 2nd Ed. San Francisco, CA: Jossey-Bass.

 


Written by Nick Gardener

Nick Gardener spent many years working in the chemical, nuclear and automotive industries. He is now a global risk and HSE consultant working for Risk International Services Inc. He can be contacted at: ngardener@riskinternational.com
