Medical Error Prevention: Don’t Miss the Forest for the Trees 

Historically, efforts to prevent avoidable errors in healthcare focused solely on improving the skills of the individual provider. Even now, the idea that errors can be prevented if the offending individuals are more skilled, more attentive, more motivated, and more thorough permeates not only healthcare culture but also popular opinion. While this focus on individual skill is not wrong, a comprehensive approach to preventing treatment errors requires providers also to detect and correct systemic weaknesses.[1]

Why?

The Institute of Medicine (IOM) found that the human error research of sociologist Charles Perrow and psychologist James Reason, which is applied in high-risk complex industries such as aviation and nuclear power, is also applicable to healthcare. Reason explains that complex systems usually fail because of several weaknesses in the process chain. While these weaknesses, which he terms “latent failures,” do not individually cause damage, together they lead to damage at the end of the process chain. When a patient injury is discovered, attention is easily drawn to the final failure that caused the injury, usually an individual provider’s action or inaction. Reason calls these easily identifiable failures at the end of the process chain “triggering events” or “active failures.”

"The active error is that the pilot crashed the plane. The latent error is that a previously undiscovered design flaw caused the plane to roll unexpectedly in a way the pilot could not control."

Because latent failures occur more frequently than active failures, isolating active failures as the sole target in error prevention is not an effective solution.[2]

Modern healthcare is increasingly complex. While advances in technology and medicine allow for greater collaboration between institutions, specialists, and professional groups, they also create a more complex care delivery process. In response to the IOM’s recommendation that human error research be incorporated into modern healthcare risk management, Germany’s Action Alliance for Patient Safety (APS), with strong support and participation from the medical profession, including the Berlin Chamber of Physicians, has made substantial progress toward learning and changing on a systemic level. Central to the success of systemic learning in Germany is the creation of a “trustworthy learning and safety culture” that embraces sanction-free reporting in healthcare institutions.[3]

[1] S. Barth, Aus Fehlern lernen – Schwachstellen im System rechtzeitig erkennen, Berliner Ärzte (Dec. 15, 2016, 3:25 p.m.), https://www.aerztekammer-berlin.de/10arzt/40_Qualitaetssicherung/50_Patientensicherheit/Artikel_BAE_1_2009_Patientensicherheit.pdf.

[2] L. T. Kohn, J. Corrigan, & M. S. Donaldson, To err is human: Building a safer health system. National Academy Press (Dec. 15, 2016, 3:25 p.m.), https://www.nap.edu/read/9728/chapter/5#50. See also Barth, supra.

[3] Barth, supra.