Understanding “Human Error”

Humans make mistakes. Any system that depends on perfect human performance is doomed to fail. In fact, the risk of an accident is more a function of the complexity of the system than of the people involved. Humans are not the weak link in a process; we are a source of resilience, with the ability to respond to unpredictable inputs and variability in the system. The contents of this post are based on the work of Sidney Dekker in his book “The Field Guide to Understanding Human Error.”

Professor Dekker is a pilot and human factors engineer. Most of his work comes from analyzing industrial accidents and plane crashes. One such crash was the Tenerife disaster of March 1977, in which one jet rammed into another on the runway, killing 583 people.

It would be easy to blame the pilot for the crash. Had the pilot performed better, the accident could have been avoided. Remove such “bad apples,” and the system works fine.

However, on deeper inspection there were multiple causes: non-standardized radio language, bad weather, an overly crowded airport, equipment issues, and more. It was not simply “human error” that caused this crash, but a series of problems. Understanding all of these causes reveals that virtually any pilot could have made this mistake. The system needs to change to promote pilot success.

Casting blame makes us feel like we’ve offered an appropriate response to a terrible event. However, blaming does nothing to improve the system so that the next person doesn’t make the same mistake. To learn from our mistakes, we need to understand why they happened.

Local Rationality and Just Culture

No one comes to work wanting to do a bad job. 

Sidney Dekker

The local rationality principle asks us to understand why an individual’s action made sense at the time. “The point is not to see where people went wrong, but why what they did made sense [to them].” We need to understand the entire situation exactly as they did at the time, not through the benefit of retrospection. 

We balance the need to hold people accountable with the acknowledgment that most adverse events are not due to “human error.” We emphasize learning from mistakes over blaming individuals. We need zero tolerance for truly blameworthy acts like recklessness or sabotage, while not unfairly blaming individuals for system problems.

Just Culture Algorithm

The Just Culture algorithm asks a series of questions to determine the cause of an adverse event and offers an appropriate response. If an act was a deliberate act of sabotage, then severe sanctions are necessary. If reckless behavior led to the adverse outcome, the individual should be held accountable. However, if any individual’s actions in the same context could have led to the same result, then it is hardly fair to blame that person.

  1. Did the individual intend to cause harm? Did they come to work in some way impaired? This is sabotage.
  2. Did the individual do something they knew was unsafe? This is reckless behavior.
  3. Does the individual have a history of similar events with a similar root cause? This person is not learning from prior mistakes.
  4. Would three peers have made the same mistake in similar circumstances? If so, the act passes the substitution test and it is a no-blame error.
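To make the ordering concrete, here is a minimal sketch of the decision logic in Python. The function, parameter, and response names are illustrative only, not formal Just Culture terminology; the point is simply that the questions are asked in sequence and the first “yes” determines the response, with a failed substitution test left for further review.

```python
from enum import Enum


class Response(Enum):
    SABOTAGE = "deliberate harm: severe sanctions are warranted"
    RECKLESSNESS = "knowingly unsafe act: hold the individual accountable"
    REPEATED_ERROR = "pattern of similar events: remediation and closer review"
    NO_BLAME = "substitution test passed: no blame, fix the system"
    FURTHER_REVIEW = "substitution test failed: review training and support"


def just_culture_response(intended_harm: bool,
                          knew_it_was_unsafe: bool,
                          history_of_similar_events: bool,
                          peers_would_have_done_the_same: bool) -> Response:
    """Walk the four questions in order; the first 'yes' decides the response."""
    if intended_harm:
        return Response.SABOTAGE
    if knew_it_was_unsafe:
        return Response.RECKLESSNESS
    if history_of_similar_events:
        return Response.REPEATED_ERROR
    if peers_would_have_done_the_same:
        return Response.NO_BLAME
    return Response.FURTHER_REVIEW


# Example: an error any peer could have made in the same circumstances.
print(just_culture_response(False, False, False, True))  # Response.NO_BLAME
```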

Analyzing Adverse Events

The single greatest impediment to error prevention in the medical industry is that we punish people for making mistakes.

Dr. Lucian Leape

The old-school format of Morbidity and Mortality (M&M) conferences pitted the person who made the error against a room full of experts armed with the benefit of hindsight. This adversarial arrangement encouraged people to hide their mistakes. A new approach is needed if we want to encourage bringing errors into the light so they can be analyzed and learned from. Dekker describes six steps.

Step One: Assemble A Diverse Team

The team should include as many stakeholder perspectives as are pertinent. In medicine, we would include physicians, nurses, technicians, patients, and others. The team needs expertise both in patient care (subject matter expertise) and in quality review. The one group not included is those who were directly involved in the adverse event; their perspective will be incorporated through interviews, but they do not participate in the analysis.

Step Two: Build a Thin Timeline

In airplane crashes, investigators recover the flight recorder (the “black box”) to create a timeline of events during the flight and of conversations between parties. In medicine, we look at the chart to understand what happened and when. This is a starting point, but it excludes the context needed to understand local rationality.

Step Three: Collect Human Factors Data

Interview the people directly involved in the adverse event to understand what happened from their point of view. This is best done as early as possible, since memory degrades with time. Understand what was happening in the room, why they made the choices they did, and what their understanding of the situation was, and why.

George Douros presents a series of questions on the EMCrit Podcast to guide the collection of this human factors data.

Collecting Human Factors Data (George Douros)

Step Four: Build a Thick Timeline

With the human factors data in hand, overlay it on the thin timeline to build a thick timeline. This presents the events as they occurred within the context under which the providers were working. You may need to go back and interview providers again until you understand what happened as they understood it at the time. Only then do we achieve local rationality.
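As a rough sketch of what this overlay can look like, the snippet below merges chart events (the thin timeline) with interview-derived context (the human factors data) into a single chronological view. The entries, field names, and events are invented purely for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass(order=True)
class TimelineEntry:
    time: datetime
    source: str = field(compare=False)        # "chart" (thin) or "interview" (human factors)
    description: str = field(compare=False)


def build_thick_timeline(chart_events: List[TimelineEntry],
                         human_factors: List[TimelineEntry]) -> List[TimelineEntry]:
    """Overlay interview-derived context onto the chart-derived thin timeline."""
    return sorted(chart_events + human_factors)


# Invented example entries, purely for illustration.
thin = [
    TimelineEntry(datetime(2024, 3, 1, 2, 10), "chart", "Antibiotic ordered"),
    TimelineEntry(datetime(2024, 3, 1, 3, 40), "chart", "Antibiotic administered"),
]
context = [
    TimelineEntry(datetime(2024, 3, 1, 2, 15), "interview",
                  "Nurse covering three critical patients; pharmacy line busy"),
]

for entry in build_thick_timeline(thin, context):
    print(entry.time, f"[{entry.source}]", entry.description)
```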

Step Five: Construct Causes

We don’t find causes. We construct causes from the evidence we collect. The causes of an error are complex and are not lying in the open waiting to be discovered; we have to work to understand and propose possible causes. One method of organizing them is an Ishikawa diagram (also called a fishbone diagram).

Ishikawa (fishbone) diagram for analyzing potential causes of adverse events. The adverse event is placed at the fish’s head on the right. Branching off the spine are ribs for the areas where errors may arise; along each rib, place the potential errors and their supporting details.
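If it helps to see the structure explicitly, a fishbone can be reduced to nothing more than a mapping from cause categories to lists of contributing factors. The event, categories, and causes below are invented placeholders, not drawn from a real case.

```python
# A fishbone diagram reduced to its data: the adverse event at the head,
# and contributing causes grouped under category "ribs".
# The event, categories, and causes are invented placeholders.
fishbone = {
    "event": "Delayed antibiotic administration",
    "causes": {
        "People": ["Nurse covering three critical patients"],
        "Process": ["Verbal order not entered until after the resuscitation"],
        "Equipment": ["Dispensing cabinet required pharmacy verification"],
        "Environment": ["Overcrowded department during a surge"],
    },
}

print(fishbone["event"])
for category, factors in fishbone["causes"].items():
    print(f"  {category}:")
    for factor in factors:
        print(f"    - {factor}")
```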

Step Six: Make Recommendations

Brainstorm potential solutions that would prevent others from arriving at the same outcome. Ideally, recommendations are worded so that they are SMART: specific, measurable, achievable, relevant, and time-bound.

Final Thoughts

Remember that this information is protected. It includes patient data and, as such, is protected under HIPAA. Do not put it on publicly available platforms such as Google Slides or Zoom.

Additionally, the entire quality improvement process should be a safe space that encourages providers to examine their errors. As such, it is protected under the Patient Safety and Quality Improvement Act of 2005 (Public Law 109-41), signed into law on July 29, 2005. Use an approved slide template that includes the appropriate language, for example:

This document is privileged and confidential under the Illinois Medical Studies Act and should not be shared or distributed other than through the Quality Assurance Committee structure.

Dr. Douros recommends the following agenda for a 30-minute M&M case:

  • Introduction: Remind the group that this is about learning and identifying systemic problems, not about blame & shame.
  • Present the thin and thick timelines: this should take about 10 minutes and leave out extraneous information. It can be presented by a junior resident, but they will need the support of a senior facilitator to keep the discussion on track.
  • Discuss the case: identify potential causes with the group, possibly using a fishbone diagram. This should also last only about 10 minutes.
  • Look for systemic problems and solutions: the goal of the exercise is to identify potential solutions that would prevent a similar mistake from happening again. The bulk of the time, 10 to 15 minutes, should be spent in this section.

References

  1. Sidney Dekker, “The Field Guide to Understanding Human Error”
  2. Angels of the Sky: Dorothy Kelly and the Tenerife Disaster
  3. EMCrit 249 – You Can Either Learn or You Can Blame – Fixing the Morbidity and Mortality Conference with George Douros
  4. The Patient Safety and Quality Improvement Act of 2005
