
Why don't we learn from healthcare incidents?

Updated: 11 June 2020


Learning from incidents

We try to prevent incidents and calamities by estimating prospective risks. When an adverse event or near miss does occur, it is important to determine retrospectively how the incident took shape. Various analysis methodologies, such as PRISMA and RCA, can be used to facilitate this process. Everybody benefits from this kind of incident analysis, because it helps prevent similar incidents in the future. Yet this is exactly where things often go wrong…

This blog describes three crucial situations in which learning from incidents can go wrong, and then shows how to recognise the signals and respond to them.

Learning from incidents can go wrong in 3 crucial situations

There are various ways to examine the origin of a mistake. It is useful to realise, however, that people can respond to an incident in very different ways. Caregivers' subconscious conviction that they must perform faultlessly is not the only issue at play. There are three crucial situations in which learning from incidents can be derailed:

1. The human response to incidents

People overestimate their own abilities at the very moment they make a mistake. The root cause is often sought in external factors. A doctor with a positive self-image who makes a mistake may, for example, attribute the error to an inadequate handover by a colleague. This is also known as cognitive dissonance: failing to take your own shortcomings into account when an incident occurs.

2. Work pressure in healthcare

The healthcare sector is a complex environment in which protocols cannot cover every risk. In addition, it is not always possible to work according to protocol. Professor Klein, a specialist in safety science, gives the example of anaesthetists who, according to a WHO directive, must wash their hands every time they touch the computer or the bed of a patient. In day-to-day practice, that would mean washing your hands 145 times an hour.

Care professionals have little time to spare and therefore tend to devise their own workarounds to get past such hurdles. These workarounds are rarely shared with colleagues, because they are improvised under time pressure and are not always effective.

3. Jumping to conclusions

We sometimes tend to draw conclusions too quickly. Take the example mentioned above. If an incident occurs, the conclusion might be: “it happened because the hand-wash protocol was not followed”. This inference, however, takes no account of indirect factors within the organisation, such as a culture of fear, austerity measures and performance pressure. The majority of incident analyses do not reveal what actually happened or what the circumstances were at that moment. Measures are then taken without a thorough examination of the root causes, and care professionals do not always know who issued a measure or why it was imposed. A missed opportunity, because nobody learns from the mistake.

Recognise the inability to learn

Certain signals give away the behavioural patterns discussed above, namely:

  • Remarks such as “we have done it like this for years and nothing has ever gone wrong” or “that would never happen to us, because we do it differently around here”.
  • When staff are reluctant to execute incident analysis.
  • When staff express no interest in the outcome of incident analysis.
  • When staff do not recognise the value of incident analysis.
  • When what actually happened does not emerge from the incident analysis.

"A thorough analysis of a few incidents is to be preferred to superficial registration of as many incidents as possible."

Incident analysis to create awareness

A healthcare provider that is aware of these behavioural patterns can pick up the signals. Behaviour is often driven by subconscious, automatic responses. A careful, thoroughly executed incident analysis can create awareness of how incidents originate.

>> Want to know more about data analysis? This page describes five methods to analyse incidents and risks.

It is important to make incident reporting as safe and accessible as possible. When an incident is analysed and the combination of circumstances is reviewed, it often becomes apparent that there are several weak links along the process chain. This yields a great deal of information that can be discussed openly, and the weak links in the process can then be addressed on that basis. This is far more effective than rowing against the current of the signals described above.

How can TPSC help in this endeavour?

TPSC software supports simple, accessible incident reporting. The nature and severity of the incident are registered immediately in the notification. Various analysis methodologies bring the root causes of an incident into view. Weaknesses in existing processes are highlighted and analysed, and the processes are then optimised. This improves awareness of how an incident came into being. The standard solutions can be adjusted to fit the specific requirements of the healthcare provider.

How does TPSC software contribute to learning from mistakes?

  • Quick and easy online incident reporting
  • Analysis with the help of RCA, PRISMA or HFMEA
  • Root causes are classified
  • Trends are made clear and visible
  • Improvement measures are embedded with the help of our Improve 2.0 application
  • Dashboards and reports make it easy to share information and create awareness

Like to read more about learning from incidents?
Download the eBook 'Incident Management, in search of continuous healthcare improvement'.