The British philosopher Bertrand Russell is often credited with observing, “The whole problem with the world is that fools and fanatics are so sure of themselves, while wiser people are so full of doubt.” This astute observation encapsulates a psychological phenomenon now widely known as the Dunning-Kruger Effect.
Named after psychologists David Dunning and Justin Kruger, who first described it in 1999, this cognitive bias explains why individuals with low ability at a task often overestimate their competence, while those with high ability tend to underestimate theirs. Or in layman’s terms, stupid people do not know that they are stupid.
The Dunning-Kruger Effect manifests in various real-world scenarios. For instance, elderly drivers who believe they are superior behind the wheel are four times more likely to make unsafe driving errors. Gun owners who consider themselves highly knowledgeable about gun safety often score the lowest on gun safety tests.
Similarly, medical lab workers who rate themselves as highly competent are frequently the least skilled in their roles. Even in academic settings, the lowest-performing college students dramatically overestimate their exam performance, while the least successful debaters wildly overestimate their debating skills.
This phenomenon isn’t limited to specific skills or professions. People with unhealthy lifestyle habits often rate themselves as far healthier than they actually are. Those who perform poorly on cognitive reasoning and analytical thinking tests severely overestimate their intellectual abilities. But why does this happen? To understand, we must delve into the nature of knowledge and self-awareness.
The four quadrants of knowledge
Knowledge can be divided into four quadrants:
1. Known knowns: Things we know that we know. For example, “I know how to ride a bike.”
2. Known unknowns: Things we know that we don’t know. For example, “I have no idea how quantum physics works.”
3. Unknown knowns: Things we’ve forgotten we know or don’t realise we know. For example, you might still remember how to drive to your childhood home but have forgotten that you know this.
4. Unknown unknowns: Things we don’t know that we don’t know.
When we are novices in a field, we are acutely aware of the little we know (known knowns) but remain oblivious to the vast expanse of what we don’t know (unknown unknowns). For example, someone who knows nothing about football might think the sport is simple—just kicking a ball into a net. However, as they learn more, they discover the nuances: shooting mechanics, defensive strategies, and various types of shots. This growing awareness of what they don’t know creates doubt and humility.
As expertise grows, much of what an expert knows becomes automatic and unconscious. For instance, a seasoned football player no longer thinks about their shooting form or free-kick technique—it’s second nature. This unconscious competence means experts often forget how much they know, leading them to underestimate their abilities. Meanwhile, novices, unaware of their gaps in knowledge, overestimate their competence.
The knowledge circle
Another way to visualise this is to think of knowledge as a circle. The area inside the circle represents what you know, and the border represents the horizon of your knowledge—everything you’re aware of but don’t yet understand. As your circle of knowledge grows, so does the horizon of your awareness. The more you know, the more you realise how much you don’t know.
Interestingly, as knowledge becomes automatic, a second, smaller circle forms within the first. This inner circle represents what you’ve forgotten you know—knowledge so ingrained it feels obvious. This dual-circle model explains why experts often feel uncertain: they are acutely aware of the vastness of their unknown unknowns, while much of their knowledge is unconscious.
The paradox of overcoming the Dunning-Kruger Effect
Learning about the Dunning-Kruger Effect often leads people to think, “Thank goodness I’m aware of this bias—I must be immune to it.” Ironically, this very thought exemplifies the effect. Research shows that simply being aware of cognitive biases doesn’t make us less susceptible to them. This is because blind spots, by definition, are invisible to us.
The comfort of certainty plays a significant role in this paradox. Humans dislike uncertainty and often settle on beliefs—true or not—to alleviate anxiety. This tendency to default to certainty, even in the absence of evidence, reinforces overconfidence in the incompetent and doubt in the competent.

The challenge of changing minds
Attempting to correct others’ misconceptions often backfires. When people’s beliefs are challenged, they don’t typically change their minds; instead, they become more defensive. This is because beliefs are often tied to identity and group affiliation. When confronted with contradictory evidence, the response isn’t, “I need to update my assumptions,” but rather, “I’m under attack.”
This dynamic is evident in online arguments, where even sound reasoning and data are routinely dismissed. People’s beliefs are less about logic and more about their sense of belonging to a tribe, which makes changing minds an uphill battle.
Cultivating humility
So, what can we do? For ourselves, the key is to cultivate humility. This means holding fewer opinions or at least holding them less strongly. It involves being less emotionally attached to our beliefs and more open to the possibility of being wrong. When faced with something upsetting or angering, instead of jumping to conclusions, we can pause and say, “I don’t know.”
Humility is particularly important in an age where false confidence is so often rewarded. On the internet, boldness, zealotry, and fanaticism garner attention and validation. However, life is complex, and most of us don’t truly know what we’re doing most of the time. Acknowledging this opens us up to learning and growth, and keeps us from becoming narcissists online.
Planting seeds, not changing minds
When it comes to others, it’s essential to recognise that we can’t force people to change their minds. Instead, we can plant seeds—ideas or arguments that may take root over time. Profound truths often do not sink in immediately; they require the right environment or life experiences to resonate.
This approach requires patience and an understanding that meaningful change often takes time. Just as ideas planted in our youth may only make sense decades later, the seeds we plant in others may take years to sprout.
The practicality of humility
Ultimately, the Dunning-Kruger Effect teaches us that humility is not just a virtue but a practical tool for navigating life. By intentionally underestimating our understanding, we remain open to learning and growth. Humility allows us to approach the world with curiosity rather than arrogance, fostering deeper connections and more meaningful progress.
In a world that often rewards bluster and bravado, humility stands as a quiet but powerful antidote to the pitfalls of overconfidence. As we strive to understand ourselves and others, embracing the limits of our knowledge may be the wisest step of all.
And if you’re now thinking, “Wow, I’m so humble,” remember: that’s probably the Dunning-Kruger Effect talking.