You know less than you think you do

(and that’s a good thing)

You know less than you think you do.

Hard statement, I know.

But it’s true—we humans greatly overestimate how much we actually know.

Take this fascinating experiment: Researchers asked people if they understood how bicycles worked. Most confidently said yes. Then, they were asked to draw a bike—including all the mechanisms needed to make it function. 1

Not so easy.

This experiment has been replicated numerous times with different examples—how toilets work, drawing a detailed eukaryotic cell or DNA, explaining basic economic principles. Each time, the results are the same: we think we understand something, but when pressed, we realize we don’t.

This phenomenon has a name: the Dunning-Kruger effect. 2

The Illusion of Knowledge

One of the most fascinating aspects of the Dunning-Kruger effect is that it’s even more pronounced in people with just a little bit of knowledge. Those who know the least often feel the most certain.

Studies on the psychology of belief show that low-information individuals are more likely to feel assured in their opinions simply because they don’t recognize their own knowledge gaps. 3 It’s not about intelligence—it’s about how the brain works. Instead of engaging deeply, we mistake surface-level familiarity for true expertise.

And then comes social media.

We read a few articles, watch a couple of YouTube videos, and suddenly…we’re experts.

  • During the COVID-19 pandemic, we saw ordinary people suddenly become armchair epidemiologists.

  • After natural disasters, there is a surge of self-proclaimed meteorologists and government efficiency experts.

  • During political upheaval, we all become political scientists and constitutional scholars.

We consume bite-sized content, hear confident voices, and start to believe we know something. The problem? True expertise requires deep study, self-doubt, and an awareness of complexity.

The Confidence Trap

Think about how quickly someone can go from watching a single documentary or reading a handful of blogs to thinking they’ve “done their research,” sometimes believing they know more than actual experts who have spent decades in the field.

  • A climate scientist understands how much there is still to explore and remains open to nuance.

  • A conspiracy theorist skimming blog posts may feel more certain than the scientist—despite knowing significantly less.

The less expertise someone has, the less they recognize what real expertise looks like. It’s not stupidity—it’s human nature.

The Dangerous Consequences of False Certainty

I joke about this a little, but the effects are serious—and sometimes deadly.

Disinformation spreads rapidly. Conspiracy theories and even cults thrive on people feeling sure about things that aren’t true. And in today’s world, there are plenty of people misrepresenting themselves as experts. Many present opinions as facts—and we believe them.

Because that’s what our brains do.

Until we choose not to.

How to Protect Yourself from the Dunning-Kruger Effect (and Other Cognitive Biases)

So, what can we do? How do we make sure we aren’t falling into this trap ourselves?

We start by practicing intellectual humility:

Vet your sources. When evaluating information, ask yourself:

  • Who is the expert? What are their credentials? Do they have decades of research—or just a microphone?

  • Do they have a vested interest in pushing a narrative? Are they presenting facts or opinions?

  • Do they acknowledge the limitations of their data? Science is built on uncertainty and refinement—real experts admit when they don’t know everything.

  • Are they open to changing their mind when better evidence appears? Or are they rigidly attached to a belief?

  • Are they trying to sell you something? If someone profits from a particular viewpoint, that’s a red flag.

(Want a whole list of red flags? Go here.)

Question your own assumptions. Ask yourself:

  • Could I be wrong?

  • What evidence would change my mind?

  • Am I considering multiple perspectives?

Embrace complexity.

The most dangerous phrase is “It’s simple.” If a problem had an easy answer, experts would have solved it already.

Stay curious.

The moment we believe we have all the answers, we stop learning. And when we stop learning, we become even more vulnerable to misinformation.

In a world overflowing with information, the ability to separate fact from fiction has never been more critical. Misinformation isn’t just annoying—it’s dangerous. It fuels conspiracy theories, erodes trust in experts, and influences real-world decisions on health, politics, and society. With social media amplifying bad actors and false claims, we must sharpen our ability to question, verify, and think critically. If we don’t, we risk falling into the confidence trap—believing we know more than we do while being led by those who exploit certainty for power, profit, or control.

Because in the end, true wisdom isn’t about knowing everything—it’s about understanding how much you don’t know.

Thanks for reading The REAL™ | GritandGrace! This post is public so feel free to share it.

1 https://pmc.ncbi.nlm.nih.gov/articles/PMC3062901/

2 https://www.scientificamerican.com/article/the-dunning-kruger-effect-isnt-what-you-think-it-is/

3 https://pubmed.ncbi.nlm.nih.gov/10626367/
