Patient Safety And Quality

By Gerrit Van Wyk.

Surgeons are bomber pilots.

A Boeing Model 299 bomber prototype, later the famous B-17 Flying Fortress, crashed during a demonstration at Wright Field in Dayton, Ohio, on October 30, 1935. The subsequent inquiry found the crash was due to pilot error because the cockpit had become too complex. To keep the project going and save the company, Boeing’s pilots designed a checklist to ensure nothing was forgotten before takeoff. There were no further serious incidents after that, and checklists became routine in aviation.

The headline of a 2000 publication by the US Institute of Medicine (IoM), To Err is Human, was that an estimated 98,000 people die in US hospitals every year from medical errors. Instead of scrutinizing the report and fact-checking it, the media let loose the idea that doctors are incompetent cowboys and hospitals death traps. Just like that, the conditions were created for a quality movement and an opportunity for change managers.

Since many surgeries that can go wrong are performed in hospitals, the World Health Organization (WHO) started a Safe Surgery Saves Lives project in 2007. Based on the idea that a surgical procedure is much like flying a B-17 bomber, the project group performed a WHO-sponsored study on the effectiveness of a preoperative surgical checklist in reducing surgical mistakes. The study was fast-tracked for publication in a prestigious medical journal, and, based on it, surgical checklists soon became mandatory in 12 countries. The fact that the study had some methodological issues was glossed over.

Unfortunately, like most change initiatives in health care, neither the theory nor the study results were properly checked and thought through, and change management followed the usual formula taught in business schools. Opinion leaders did the rah-rah, and when that didn’t work, the authoritarian hammer of mandates came down. To this day, there is no credible evidence that surgical checklists significantly save lives. Having soothed our need to act on the “shocking” Institute of Medicine report, consultants were paid handsomely and moved on to the next flavor of the day, academics published and carved out careers, the media found something else to gossip about, and politicians found another voter concern. What went wrong?

To begin with, airline safety is about much more than a checklist. Pilots and crew report near misses to learn from; the co-pilot, not the captain, flies the plane; you are aptitude tested to see whether you have what it takes to be a commercial pilot; pilots spend hours training for unexpected events on simulators; commercial pilots normally fly only one type of airplane; working hours are limited; designers work closely with pilots to design safe airplanes and cockpits; and planes don’t work without efficient, properly maintained runways and airports, and properly trained ground and air traffic control to support them.

Let’s compare that to surgery. We never learn from our near misses and hide them lest lawyers descend on us and reduce our livelihoods to ashes; there is no co-surgeon doing the surgery; anyone can become a surgeon, but not everyone has the aptitude for it; other than in theory, surgeons are not prepared for all eventualities; surgeons are expected to operate on many different conditions, from simple to super complex; operating rooms and teams change all the time; surgeons work long hours without time off; hospitals and technology are designed for surgeons with little or no input from them; and no one pays attention, in a coordinated way, to everything needed to make surgery safe: properly designed and maintained hospitals and equipment, properly trained staff and colleagues for support, and so on. As it turns out, surgery is much more like flying private and small airplanes, which account for more than 90% of aviation crashes and fatalities and which, just as in surgery, were not reduced by checklists.

A second and bigger problem is implementation. The likelihood of a change initiative being successfully implemented is 5-15%, and the checklist initiative tripped over the same hurdle. To the best of my knowledge, no one in business science has ever researched why that is and what can be done to improve the odds. Instead, managers and leaders are repeatedly armed with degrees teaching the same non-working models, and they keep using them despite their ongoing failure. If the outcomes of management decisions and change initiatives were held to the same standard of evidence as medical procedures, none would be registrable with the FDA. The checklist initiative would not have made it off first base, and thousands of dollars would have been saved. Just because you strongly believe something works or is useful doesn’t mean it does.

If the issue is improving quality and reducing mistakes, as the Institute of Medicine and the WHO suggested, you are dealing with a problem system consisting of many interacting components, such as in the diagram below. The idea that a simple checklist, based on little evidence and implemented poorly, will improve such a system is ludicrous. A different approach is needed, which leaders, managers, academics, consultants, planners, politicians, and bureaucrats refuse to acknowledge. That is why, if we assume for a moment the Institute of Medicine data is correct, we are no safer today than 22 years ago.

To improve patient safety and quality, we must admit health care is complex and any change must involve people on the ground with the knowledge and insight from which solutions may emerge. We must improve cooperation, learn to adapt rather than control, learn from our actions and mistakes, and above all, engage in ongoing conversation and dialogue rather than top-down communication. Bullying with mandates creates resistance, or at best compliance, but not cooperation.

The mantra, or ideology, of all regulatory authorities in Canada is protecting patients. Yet these same organizations do nothing proactively to do so. Instead, they wait for harm to be done and then punish the perceived perpetrator in a highly legalistic way. If patients don’t complain, no harm was done, even if there was. It seems, to them, patient safety comes from scaring physicians with the threat of punishment. The notion that patient safety is a complex undertaking, and that an opportunity exists to protect patients before harm is done, doesn’t come into the equation and, in my experience, is undiscussable.

The belief system on which health care, and the leading and managing of health care, is based is spent and must change. Social complexity indicates this will be difficult because it threatens the self-image, status, and income of people in high positions. The railway tycoons of 100 years ago couldn’t believe air travel would shrink their industry, and many were bankrupted. The day will come when the paradigm in health care shifts as well, no matter how long and hard we resist it.

Postscript

On December 19, 2022, extreme cold weather and snowstorms hit Canada and North America, resulting in chaos at airports as flights were delayed and cancelled. Many travelers were unable to continue to their warm-weather holidays, and others were unable to spend Christmas with their families. It was the airlines’ equivalent of a COVID-19 pandemic.

Just as health care knew a pandemic would hit at some point but nevertheless didn’t prepare for it, airlines know weather is a problem in this part of the world at this time of year, yet don’t prepare for it. In both instances, the reasons can be found in hidden social complexity, and both industries ignore the complexity they deal with by pretending it doesn’t exist. For most people a pandemic or travel disruption is an inconvenience, but in health care it also becomes a safety issue, and people die. The only way out is acknowledging the social, biological, and technological complexity and dealing with it appropriately. I use air transport as an example because, after all, some would like to see physicians as pilots.