
Principles of Patient Safety


 

Anesthesia Crisis Resource Management (ACRM) – ACRM was established at Stanford University in the 1990s by David Gaba, MD, and colleagues to address issues of human performance and patient safety in anesthesia. Anesthesia providers often face situations that can escalate into critical incidents because of the complexity of an ever-changing work environment. ACRM describes the role of environmental, personal, and other factors in the creation of critical and adverse events, and ACRM training provides the anesthesia provider with behavioral and intellectual recommendations to manage these risks more effectively.

-Fletcher JL. AANA Journal Course: Update for nurse anesthetists - ERR WATCH: Anesthesia crisis resource management from the nurse anesthetist's perspective. AANA J. 1998;66(6):595-602.

 

Attention allocation – Attention allocation is key in crisis resource management and is part of the ERR WATCH acronym. Providing anesthesia care is a challenge in itself because of its dynamic nature. Patient conditions may change in overt or subtle ways, and these changes may occur abruptly or slowly. The anesthesia provider continuously monitors several parameters; however, not all parameters need the same level of attention at all times. Part of crisis resource management is recognizing that the anesthesia provider's attention is a limited resource and learning how best to allocate it in a continuously changing environment.

-Fletcher JL. AANA Journal Course: Update for nurse anesthetists - ERR WATCH: Anesthesia crisis resource management from the nurse anesthetist's perspective. AANA J. 1998;66(6):595-602.

 

Closed-Loop Communication – Closed-loop communication involves the speaker directing an order to a specific individual by name and that individual repeating the message back for validation. The speaker then knows not only that the message was received but also that it was received correctly. In critical events, the CRNA should communicate what has happened, what treatments are being initiated, and their results, all as factual statements.

-Fletcher JL. AANA Journal Course: Update for nurse anesthetists - ERR WATCH: Anesthesia crisis resource management from the nurse anesthetist's perspective. AANA J. 1998;66(6):595-602.

 

Complex systems – Complex systems are open systems composed of many components that are interrelated in different ways. They change continuously as they interact with the environment and lack clear-cut boundaries; it may be difficult or even impossible to decipher where the system ends and the environment begins.

-Dekker S. Patient Safety: A Human Factors Approach. Boca Raton: Taylor & Francis Group; 2011.

 

Environment – Each anesthesia provider should be well acquainted with his or her work environment. It is important not only to know one's immediate environment but also to know where emergency equipment is located and how to use it, including being able to troubleshoot malfunctions and knowing where and how to find a replacement if needed. One's environment also includes OR personnel and how they affect the anesthesia provider's duties. Knowing one's work environment includes being aware of how internal and external influences may affect the patient.

-Fletcher JL. AANA Journal Course: Update for nurse anesthetists - ERR WATCH: Anesthesia crisis resource management from the nurse anesthetist's perspective. AANA J. 1998;66(6):595-602.

 

ERR WATCH – ERR WATCH is an acronym developed by Joanne L. Fletcher, CRNA, EdD, to translate the principles of ACRM from the nurse anesthetist's perspective. Each letter stands for a variable contributing to ACRM, allowing a quick review for the nurse anesthetist: Environment, Resources, Reevaluation, Workload, Attention, Teamwork, Communication, and Help. Familiarity with each of these concepts improves one's management of a critical situation.

-Fletcher JL. AANA Journal Course: Update for nurse anesthetists - ERR WATCH: Anesthesia crisis resource management from the nurse anesthetist's perspective. AANA J. 1998;66(6):595-602.

 

Expectations – An expectation is an anticipated outcome, most often envisioned positively, that one considers reasonably likely to occur. When an individual expects something, he or she is usually mentally prepared for that event to occur. The actions one consciously takes are based on assumptions about how one's surroundings will be affected by what one does. Expectations focus the individual's attention on certain aspects of an event and therefore affect what is noticed, considered, and remembered.

-Weick K, Sutcliffe K. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco, CA: Jossey-Bass; 2001.

 

Fixation errors – Fixation errors develop from a continued failure to reevaluate an incorrect diagnosis. In one type, the anesthesia provider thinks 'everything but this,' moving from one activity to another without committing to a definitive treatment, even when there is strong evidence that a negative event is occurring. A second type involves fixating on a minor aspect of a larger problem while failing to treat the overall problem. A third type is denial of a potentially catastrophic event; the CRNA fails to treat the problem, leading to an ominous outcome.

-Fletcher JL. AANA Journal Course: Update for nurse anesthetists - ERR WATCH: Anesthesia crisis resource management from the nurse anesthetist's perspective. AANA J. 1998;66(6):595-602.

 

High reliability organizations – High reliability organizations are groups that operate under complex and trying conditions at all times yet still manage to have fewer accidents than expected. Examples include power grid dispatching centers, air traffic control systems, nuclear aircraft carriers, nuclear power generating plants, hospital emergency departments, and hostage negotiation teams. These organizations face a great number of unexpected events because of the complexity of their technology and the varied demands placed on the individuals working in them, yet they rarely fail. Additionally, the individuals who run these organizations, like most people, have an incomplete understanding of their own systems and of what they face.

-Weick K, Sutcliffe K. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco, CA: Jossey-Bass; 2001.

 

Hindsight bias – Hindsight bias is a distortion of judgment that arises from knowing the outcome of a series of actions and events. Once an individual knows how the outcome came about, he or she begins to simplify the causal history that led up to it, inflating the significance of the single piece of information or action that might have changed the course of events. The 'messiness' or complexity of the situation is often ignored, as are the many other demands on the anesthesia provider's attention and the fact that, at the time, it was not known how things would turn out.

-Dekker S. Patient Safety: A Human Factors Approach. Boca Raton: Taylor & Francis Group; 2011.

 

Human Error – Human error arises from the physiological and psychological limitations of human beings. Errors may be caused by factors such as fatigue, workload, and fear, as well as cognitive overload, poor interpersonal communication, inaccurate information processing, and faulty decision making. As Helmreich puts it, "Human beings by their very nature make mistakes; therefore, it is unreasonable to expect error-free human performance."

-Helmreich RL. On error management: lessons from aviation. BMJ. 2000;320:781.

 

Human factors – Human factors takes environmental, organizational, and job factors, as well as individual characteristics, into account, examining how these factors affect behavior at work in ways that may influence health and safety. In simple terms, human factors considers the job, the individual, and the organization, and how they influence people's health- and safety-related behavior.

-World Health Organization: Patient Safety. World Health Organization Website. http://www.who.int/patientsafety/research/methods_measures/human_factors/en/. Updated 2014. Accessed March 6, 2014.

 

Local Rationality Principle – The local rationality principle holds that individuals act in ways that make sense to them given the goals they are pursuing, the knowledge they have available about the problem, and the direction of their attention at the moment.

-Dekker S. Patient Safety: A Human Factors Approach. Boca Raton: Taylor & Francis Group; 2011.

 

Mindfulness– Mindfulness is a combination of continuous evaluation of current expectations, the application of new and different experiences, the ability and willingness to make sense of unprecedented events, appreciation of the situation and ways to deal with it, and the ability to recognize new dimensions of the situation. Mindfulness improves foresight and current functioning.

-Langer E. Minding matters: the consequences of mindlessness-mindfulness. In: Berkowitz L, ed. Advances in Experimental Social Psychology, Volume 22. San Diego, CA: Academic Press; 1989:137-168.

and

-Weick K, Sutcliffe K. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco, CA: Jossey-Bass; 2001.

 

Mitigation – Mitigation is the reduction in seriousness or severity of a situation. According to Dekker (2011), there are six levels of mitigation, which successively decrease in their power to persuade another practitioner that what is being said is of value or importance: commands, joint obligation statements, joint suggestions, queries, preferences, and hints. These behaviors evolved to ease social interactions in ways that ensure acceptance, and even survival, within social structures. Such mitigating techniques are entirely inappropriate in complex situations involving information overload and decision ambiguity from quickly developing anomalies with uncertain outcomes.

-Dekker S. Patient Safety: A Human Factors Approach. Boca Raton: Taylor & Francis Group; 2011.

 

Near miss – A near miss may be described as a successful outcome in which chance played a vital role in averting failure. It occurs when a negative event could have happened, for example because of a hazardous condition, but did not. These situations offer evidence of a system's resilience as well as its vulnerability. Even though the overall outcome may be viewed as a success, a near miss is a failure in other regards. However, consistent with outcome bias (Baron and Hershey 1988), decision makers tend to evaluate a near miss like a success based on the ultimate outcome. A near miss is therefore likely to be assessed in the same manner as a success, and significantly differently from a failure, even though it would have been a failure but for the element of chance.

-Dillon RL, Tinsley CH. How near-misses influence decision making under risk: a missed opportunity for learning. Management Science, Articles in Advance. 2008:1-16.

 

Normalization of deviance – Normalization of deviance is the gradual process of accepting practices or standards that are unacceptable. When repeated deviant behaviors lead to outcomes that appear normal, or at least not catastrophic, the deviance becomes the social norm for an organization. The process repeats itself: seemingly successful outcomes give individuals the impression that the risk is "under control," which leads to further risky or deviant decisions and a steady progression toward increased risk. With each step away from the previous norm producing no negative outcome and no observed sacrifice of safety, one develops a bias that nothing catastrophic will happen and departs a little further from the norm.

-Dekker S. Patient Safety: A Human Factors Approach. Boca Raton: Taylor & Francis Group; 2011.

 

Person approach – The person approach is the long-standing tradition of focusing on the unsafe acts, errors, and procedural violations of individuals at the "sharp end": nurses, physicians, surgeons, anesthetists, pharmacists, and other professionals. This approach views unsafe acts as arising from aberrant mental processes such as forgetfulness, inattention, poor motivation, carelessness, negligence, and recklessness. Followers of this approach tend to treat errors as a moral issue, assuming that bad things happen to bad people, which psychologists call the just world hypothesis.

-Reason J. Human error: models and management. BMJ. 2000;320:768.

 

Production Pressure – Production pressure refers to actual or internalized pressures or incentives on an individual to prioritize production over safety. It has at least two negative effects that reduce safety. First, production pressure can push individuals to commit 'violations': deliberate deviations from safe practices (designed by managers and regulatory organizations) toward potentially hazardous ones. Second, production pressure creates haste, which increases the likelihood of unintentional errors in judgment and performance. Haste not only creates error-inducing environments but can also interact with other performance-shaping factors such as fatigue.

-Gaba D, Howard S, Jump B. Production pressure in the work environment: California anesthesiologists' attitudes and experiences. Anesthesiology. 1994;81(2):488-500.

 

Resources – Even though anesthesia providers are their own primary resource, it is important to be aware of the additional resources available. It is extremely difficult to track more than two or three rapidly changing parameters, or to perform more than one physical task at a time, with continuous accuracy, and most emergencies require more than one set of hands and eyes. The CRNA must recognize when the workload becomes unmanageable and use the resources immediately available to manage a changing situation. The circulating nurse and surgical team can readily assist the CRNA while help is summoned. Other individuals, such as medical students or surgical technicians, can perform tasks that require no special training, such as obtaining equipment and transporting specimens to the laboratory. Surgeons may also offer expertise regarding bleeding or other surgical emergencies.

-Fletcher JL. AANA Journal Course: Update for nurse anesthetists - ERR WATCH: Anesthesia crisis resource management from the nurse anesthetist's perspective. AANA J. 1998;66(6):595-602.

 

Risk of Complexity – Increased complexity can lead to an increased risk of failure. Barriers are created to stop a progression of errors and failures before they contribute to an adverse event, and many redundant mechanisms, safety systems, guidelines, policies, and procedures are designed to keep systems from failing in ways that produce bad outcomes. However, these barriers and redundancies, whether physical obstructions or policies and procedures, can themselves add complexity to the system and increase the overall risk.

-Dekker S. Patient Safety: A Human Factors Approach. Boca Raton: Taylor & Francis Group; 2011.

 

Situation Awareness – Simply stated, situation awareness is knowing what is going on around you. Inherent in situation awareness is a notion of what is important.

A general definition by Endsley (1988) is “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future.”

-Endsley MR, Garland DJ. Situation Awareness Analysis and Measurement. Mahwah, NJ: Lawrence Erlbaum Associates; 2000.

 

  1. Situation Awareness Level 1 – Perception: Perception of important information is needed to increase the likelihood of formulating an accurate understanding of the situation.
     -Endsley MR, Garland DJ. Situation Awareness Analysis and Measurement. Mahwah, NJ: Lawrence Erlbaum Associates; 2000.
  2. Situation Awareness Level 2 – Comprehension: Situation awareness includes how people gather, interpret, store, and remember information. According to Flach (1995), "the construct of situation awareness demands that the problem of meaning be tackled head-on. Meaning must be considered both in the sense of subjective interpretation (awareness) and in the sense of objective significance or importance (situation)."
     -Endsley MR, Garland DJ. Situation Awareness Analysis and Measurement. Mahwah, NJ: Lawrence Erlbaum Associates; 2000.
  3. Situation Awareness Level 3 – Projection: Projection is the ability to predict future events and dynamics, which allows for appropriate and timely decision making.
     -Endsley MR, Garland DJ. Situation Awareness Analysis and Measurement. Mahwah, NJ: Lawrence Erlbaum Associates; 2000.

 

Swiss Cheese Model of system accidents– The Swiss Cheese Model of system accidents was created by James Reason to exemplify how negative events can occur.

High-technology systems, such as those used in the operating room, have many defensive layers. Some are engineered, such as alarms, physical barriers, and automatic shutdowns; others rely on people, such as surgeons, anesthetists, pilots, and control room operators; still others rely on procedures and administrative controls. The function of these defenses is to protect potential victims and assets from hazards. Ideally, each defensive layer would work flawlessly. In reality, the defenses are more like slices of Swiss cheese, with many holes that continually open, close, and shift location. The holes in any one slice do not normally lead to a negative outcome; however, when holes in many layers momentarily align, a trajectory of accident opportunity is created, bringing hazards into contact with victims.
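The protective effect of layered, imperfect defenses can be illustrated with a small Monte Carlo sketch (the failure probabilities below are hypothetical, chosen only for illustration, not drawn from Reason's work): each independent layer "fails" (has a hole along the accident's path) with some probability, and an accident trajectory exists only when every layer fails at once.

```python
import random

def accident_probability(layer_failure_probs, trials=100_000, seed=42):
    """Estimate the chance that holes in every defensive layer align.

    Each layer independently fails with its own probability; an accident
    requires all layers to fail simultaneously, as in the Swiss Cheese Model.
    """
    rng = random.Random(seed)
    accidents = 0
    for _ in range(trials):
        # An accident occurs on this trial only if every layer has a hole.
        if all(rng.random() < p for p in layer_failure_probs):
            accidents += 1
    return accidents / trials

# Three hypothetical imperfect defenses (e.g., alarm, human check, protocol),
# each failing 10% of the time: the combined system fails roughly 0.1% of
# the time, far less often than any single layer alone.
print(accident_probability([0.1, 0.1, 0.1]))
```

The sketch also shows why removing redundancy is dangerous: dropping one 10%-failure layer raises the accident probability roughly tenfold, which is the intuition behind defenses-in-depth that the model conveys.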

-Reason J. Human error: models and management. BMJ. 2000;320:768.

 

System approach – In contrast to the person approach, the system approach starts from the premise that humans are fallible and that errors are to be expected, even in the best organizations. Errors are viewed as consequences rather than causes, having their origins not so much in the willfulness of human nature as in 'upstream' systemic factors. According to the system approach, when an error occurs, the important question is not who made the mistake but how and why the defenses failed. Defenses, barriers, and safeguards play a key role in this approach.

-Reason J. Human error: models and management. BMJ. 2000;320:768.

 

Teamwork – Working as a cohesive group is essential during crisis situations. Critical events in medicine are often complex and require the immediate initiation of multiple treatments to achieve a successful outcome. The best way to promote a successful outcome is to involve the efforts of several people under the direction of a leader.

-Fletcher JL. AANA Journal Course: Update for nurse anesthetists - ERR WATCH: Anesthesia crisis resource management from the nurse anesthetist's perspective. AANA J. 1998;66(6):595-602.

 

Workload – The anesthesia provider's workload changes throughout a case. The CRNA should use low workload periods to finish tasks that may have been delayed or to prepare for anticipated high workload periods; many CRNAs already use the low workload period after incision to catch up on charting or to prepare supplies and drugs for emergence before they may be needed. The CRNA should also keep track of priority tasks: in an emergency, many maintenance tasks can be postponed until treatment has been initiated. It is also important to use one's resources to obtain assistance and delegate tasks, as the mental and physical workloads during critical events can be overwhelming. Additional assistance gives the CRNA time to disengage from the situation and gain a more objective view.

-Fletcher JL. AANA Journal Course: Update for nurse anesthetists - ERR WATCH: Anesthesia crisis resource management from the nurse anesthetist's perspective. AANA J. 1998;66(6):595-602.

 

References

  1. Dekker S. Patient Safety: A Human Factors Approach. Boca Raton: Taylor & Francis Group; 2011.
  2. Dillon RL, Tinsley CH. How near-misses influence decision making under risk: a missed opportunity for learning. Management Science, Articles in Advance. 2008:1-16.
  3. Endsley MR, Garland DJ. Situation Awareness Analysis and Measurement. Mahwah, NJ: Lawrence Erlbaum Associates; 2000.
  4. Fletcher JL. AANA Journal Course: Update for nurse anesthetists - ERR WATCH: Anesthesia crisis resource management from the nurse anesthetist's perspective. AANA J. 1998;66(6):595-602.
  5. Gaba D, Howard S, Jump B. Production pressure in the work environment: California anesthesiologists' attitudes and experiences. Anesthesiology. 1994;81(2):488-500.
  6. Helmreich RL. On error management: lessons from aviation. BMJ. 2000;320:781.
  7. Langer E. Minding matters: the consequences of mindlessness-mindfulness. In: Berkowitz L, ed. Advances in Experimental Social Psychology, Volume 22. San Diego, CA: Academic Press; 1989:137-168.
  8. Reason J. Human error: models and management. BMJ. 2000;320:768.
  9. Weick K, Sutcliffe K. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco, CA: Jossey-Bass; 2001.
  10. World Health Organization: Patient Safety. World Health Organization website. http://www.who.int/patientsafety/research/methods_measures/human_factors/en/. Updated 2014. Accessed March 6, 2014.