The Confidence Gap

We love data. We ask for it. We fund it. We build entire decks around it.

We say things like: “Let’s make a data-driven decision.”

And then the data comes back.

And suddenly, we have questions. Lots of questions.


The Version of Data We Think We Want

In theory, data is there to:

  • Challenge assumptions
  • Reduce risk
  • Guide decisions


It’s objective. Rational. Grounded.

It tells us what’s actually happening, not what we hope is happening.

Beautiful stuff!


The Version of Data We Actually Want

In practice?

We want data that says: “You’re right.” “This will work.” “Proceed confidently.”

We want reinforcement. Not resistance.

Because there’s a big difference between: Using data to decide

And: Using data to confirm


The Moment Everything Changes

You’ve seen it. A team has a direction. A strategy. A strong point of view.

Research is commissioned. Everyone’s excited.

Until the results come back and say: “…not quite.”

And suddenly:

  • The sample size is questioned
  • The methodology is debated
  • Someone asks if we can “look at it another way”


Which is a polite way of saying: “Can we find a version of this where we’re still right?”


Decision-Making vs Validation

This is the confidence gap. We say we want data to make better decisions.

But often, we’re really using it to:

  • Validate what we already believe
  • De-risk decisions we’ve already made
  • Build a case for something that’s already in motion


Research becomes:

  • A safety net
  • A justification tool
  • A very expensive agreement generator


Instead of what it’s supposed to be: A truth-teller


Why This Happens (It’s Not Just Ego)

To be fair, this isn’t just about stubbornness.

It’s about:

  • Time pressure
  • Stakeholder expectations
  • Emotional investment in ideas
  • The very real cost of being wrong


Because data doesn’t just inform decisions. It challenges identity. And that’s a harder thing to navigate.


What Real Confidence Looks Like

Real confidence isn’t: “We know this will work.”

It’s: “We’re willing to find out if it won’t.”

That’s a very different posture.

It means:

  • Asking better questions
  • Designing research to test, not prove
  • Being open to uncomfortable outcomes


Because the value of research isn’t in agreement. It’s in clarity.


The Risk of Getting This Wrong

When we only accept data that confirms our thinking, we don’t reduce risk. We delay it.

Because the market will eventually answer the question for us. And it’s much less polite than a research report.


The Last Word

Everyone wants data. Until it disagrees or doesn’t fit neatly into a narrative.

And that’s the moment that matters. Because research isn’t there to make us feel confident. It’s there to make us correct.

So next time you’re looking at a set of results that challenge your thinking, it might be worth asking: Are we trying to understand this, or just trying to survive it?


***


SMARI is an award-winning Indiana-based market research consultancy that was founded in 1983 with the idea of guiding change and inspiring confidence. We are proud to work with SMEs as well as a variety of Fortune 500 brands. We are powered by our core values: integrity, community, perseverance, trust, passion, curiosity, and innovation. SMARI’s expertise covers full project scopes, from instrument design to sampling & fielding services to reporting & analysis, across the Healthcare, CPG, Retail, Food & Beverage, Manufacturing, and Financial Services industries, and beyond. Much has changed in our 40+ years, but our tagline and overarching mission remain the same—to guide change and inspire confidence. Start a conversation with us at www.SMARI.com.
