Decisions When Things Are Complex

By Gerrit Van Wyk.

Shoot! Aim! Ready!

Kahneman’s model of type 1 and type 2 thinking, in which we rely on simple rules to make decisions quickly but must knuckle down and reason when those rules fail, is popular. We can get away with type 1 thinking when things are simple, but as Dörner’s research showed, we cope poorly when thinking about complex, interconnected systems, health care for example. Because we default to the clockwork model, we disconnect variables in order to manipulate them, a habit compounded by the action orientation dominating North America, where we shoot before we aim. If you act before thinking things through in a complex situation, the bear will get you no matter how big your gun.

Experts protect their self-image and hence fall into the trap of the Dunning-Kruger effect: they believe they know more than they do. They focus on what is in front of them and ignore the bigger picture, where it is better to adapt to change than to try to plan for it.

Small mistakes in complex systems add up to create big problems that often cannot be undone, and it takes time before the outcomes become apparent. When confronted with complexity, we typically spend little time reflecting on what is going on and gathering feedback, and more time making decisions with growing confidence, until disaster strikes, after which we blame the problem, not ourselves. As intricate as the human brain is, the mind processes information more slowly than a computer: we can handle only small amounts at a time, we focus on what is in front of us, and we protect our self-image and sense of competence, which compounds the problem. Simplicity creates the illusion of competence and control, and the belief that what we ignore does not exist.

People who are good at making complex decisions have a different skill set. They try to take as much as possible into account, make incremental decisions, ask “why” questions rather than “how” questions, test their theories, and pay attention to interconnections. They reflect on what they do, modify their behavior when needed, and take a structured approach. They are not smarter, nor do they have a special personality type; they are better at tolerating uncertainty, and are often older and more experienced.

Systems are complex when they consist of many interconnected parts with many interactions, are opaque, and cannot be controlled. Most people trying to solve problems in them do not understand them and make incorrect assumptions about what they are and how they work, while the people who do understand them are mostly dismissed as outsiders and ignored.

People like firefighters, soldiers, physicians, and nurses make decisions in high-risk environments with little information, unclear goals, and no defined procedures, yet, as Klein showed, they can visualize outcomes, find patterns, and reduce them to a story. They see and experience the world differently. They see interconnections others don’t, anticipate possible outcomes and notice mismatches, have an overall sense of what is happening, see patterns and make predictions from them, see the world through the eyes of others, and notice differences and discrepancies others cannot. Like expert complex decision makers, they are introspective, sense their limitations, see the big picture, learn from their mistakes, and are adaptable.

Sadly, in health care, as in many industries, the knowledge and capabilities of expert complex decision makers and experienced professionals are not valued. Politicians, planners, consultants, bureaucrats, and others lacking those skills mess with complexity by ignoring it. They focus on the hows and never ask what is going on here, how we understand this, or why this is the case. If they hear a rustle, they go to fully automatic fire before finding out whether it is a bear. We stumble into solutions like managed health care, quality management, efficiency, lean, single health regions, and so on, with no understanding of what we are dealing with or why we are doing it. When these fail, we blame the plans and formulas rather than look at ourselves.

Health care is biologically, technologically, and socially the most complex of all industries, and those three complexities interact. We can no longer keep problem solving and planning for it as if it were simple, as we do now. If we hope to change health care meaningfully, we need to include those who are good at unpacking complex systems, not keep excluding them.

As Shakespeare’s Cassius said, “The fault, dear Brutus, is not in our stars, but in ourselves, that we are underlings.” The blame for health care’s mess points squarely at decision makers who doggedly refuse to admit their method is bankrupt and to consider alternatives, including involving experienced professionals at the coalface in planning. Meaningful change comes from the fringes, not the immovable center. Nothing will move until the center changes.