Overconfidence In Judgment And Decision Making

By Gerrit Van Wyk.

The Dunning-Kruger effect.

The Dunning-Kruger effect says that people who lack competence in a particular area overestimate their abilities, and, some say, that high performers underestimate theirs. The problem is that overconfidence affects judgment and decision making, often with disastrous consequences.

Simone Lackner and co-workers published a paper in Nature Human Behaviour this week showing that confidence grows faster than knowledge and peaks at an average level of knowledge. Confident people with average knowledge tend to discount scientific evidence, which, as we discovered during the COVID pandemic, can result in the spread of false information and conspiracy theories.

This is not a trivial matter. We live in an increasingly complex world, and, as Dietrich Dörner argued, we break complex interconnected systems into little pieces we can manipulate. Combined with the North American action bias and overconfidence fueled by knowing just enough to light the keg of dynamite, we stagger from one self-induced disaster to the next, without stopping for a moment to consider that we are the agents of our own misfortune. As Pogo said, we have met the enemy and he is us.

Small mistakes in complex systems add up to become big ones, or messes, as Ackoff called them, and it takes a while to see this happening, by which time things are out of control. Every time we make a mistake and then try to correct it, the treadmill speeds up.

Instead of taking a timeout to think about what is going on and to gather information, most of us make more and more decisions with growing confidence until disaster strikes, which we then blame on everyone and everything but ourselves.

It’s concerning, when one reads blogs, articles, newspaper comments, reports, etc., how confident the authors are that they are right, and how thin or even non-existent the evidence supporting their opinions and beliefs is, which obliquely proves Dunning and Kruger, and Lackner and her co-workers, right: a little knowledge dangerously grows into overconfidence. More concerning is that the same applies to many policies, plans, projects, etc.

The fact is, if our world, and particularly our social world, is complex, and I argue it is, then one can only know a fraction about anything of interest in it, and even experts can’t know everything. But one can add to that knowledge by engaging others, who know different things, in conversation and dialogue. The problem is that, particularly here in North America, our social world is becoming more and more fractured along rigid lines and positions of intolerance, based on dogmatic ideologies closed to alternative perspectives and opinions.

Sukenik and colleagues showed the Big Five trait of agreeableness has a downside: there is an association between agreeableness and overconfidence. Because of their positive self-image, agreeable people project more confidence in their knowledge and skills than they actually have, which benefits their social status. In other words, being likeable and confident is not the same as being competent.

The implication is that people who are likeable and confident because they believe in themselves, even though that belief may be a mirage built on average knowledge and insight, are more likely to have their plans and projects accepted, at the expense of others who have more knowledge and insight and are therefore less confident that they know and can solve everything.

That is the foundation of the consulting game: turn up with a 4×4 or a colorful PowerPoint, project confidence, and the job is yours. Those who make the hiring decision already know they don’t have the answers, and, in our complex world, are uncertain they even know what the problem is. Given their uncertainty, no wonder that looks attractive. But overconfidence and a lack of knowledge and understanding are a recipe for disaster when dealing with a complex problem, which means the whole cycle starts again, just from a worse starting point.

We like to be around likeable, confident people; we admire them, hire them, and vote for them, without for a moment considering that their confidence may not stand on a firm foundation. Evidence shows we see ourselves as smarter, prettier, and more competent than we really are. One of the ironies of depression is that it strips away that veneer, so we see ourselves exactly as we really are, which is, well, depressing.

What we really are is slow, conscious thinkers who can only process a small amount of information at a time. Add to that our tendency to protect self-image and project competence, the fact that we don’t remember and retrieve exact replicas of events, and the fact that we evolved to focus on the problems in front of us, and we are ill equipped to deal with the complex issues of our time. Simplicity creates an illusion of competence and control, and the belief that what we ignore doesn’t exist.

Paying attention to complexity requires ongoing incremental decisions, testing theories, asking “why” rather than “how” questions, paying attention to interconnections, and reflecting on how we behave and how that behavior affects what we interact with. It also requires tolerating uncertainty, and having experience is an advantage. The fact is, complex entities are interconnected, opaque, and cannot be controlled, and we must understand that.

Experienced people like firefighters, healthcare professionals, and soldiers, who make decisions in high-risk environments with little information, no clear goals, and no obvious procedures to rely on, use intuition, visualize outcomes, find similarities with past situations, and reduce the situation to a narrative, or story. In other words, they learned to see and experience a complex world differently, which makes them sensitive to interconnections, mismatches, and patterns, and able to anticipate events. They make predictions about how things may unfold, include the perspectives of others, and notice differences and discrepancies others don’t see. Echoing Dörner’s research on decision making in complex situations, they are introspective, sensitive to their limitations, see the big picture, are self-critical, and adapt easily. Note how that is the opposite of confident and overconfident people.

In conclusion, we are not very good at making decisions in complex situations, because we are too confident, ignore complexity, and are influenced by our social biases. We evolved to deal effectively with day-to-day issues by simplifying them, but at the expense of the complex reality we live in and the very complex social world we created. Consequently, the way we make decisions and plan not only sets us up for failure, it often makes things worse. There are people who, through experience, learning, or disposition, are better able to manage complex problems, but that knowledge makes them outsiders, which, for social reasons, means we exclude them and ignore the contribution they can make. Instead, we fete and pay overconfident people promising simple solutions to complex problems, and are disappointed when, predictably, they fail and we end up in a bigger hole.

Carlo Rovelli said about quantum mechanics, “my effort here is not to modify quantum mechanics to make it consistent with my view of the world, but to modify my view of the world to make it consistent with quantum mechanics”. It’s time we stop approaching the big, wicked, complex problems we face by trying to modify reality to fit our view, and instead change our view to fit reality. Stubborn overconfidence based on the wrong knowledge is not the solution.