
Using Data to Outsmart Your Overconfidence

Shaun Davis




We all overestimate. Data helps us plan smarter and deliver more.

Humans are notoriously bad at predicting the future. We overestimate our abilities, underestimate the time things take, and assume future-us will somehow be faster, sharper, and better organized than present-us. At Action, we get a reminder of that truth every year.

Once a year, we take a week off from our billable work to focus on Action itself. It’s a week for all the things we say we’ll get to but never have time for. We write, connect, plan, and build.

On Monday, we set intentions for what we planned to accomplish. By Friday, every one of us had fallen short of those intentions.

That’s not failure, that’s feedback.

Reflecting on it since then, I’ve been thinking about how we might have done things better. It’s a universal human experience: we’re far less capable than we imagine. Without any training, we believe that, in a pinch, we could land a plane, run a multinational corporation, perform surgery, or even work a fast-food drive-through window. This is so prevalent there’s a cognitive bias named for it: the Dunning–Kruger effect.

“The Dunning–Kruger effect is a cognitive bias that describes the systematic tendency of people with low ability in a specific area to give overly positive assessments of this ability.”

The same is true when we estimate work or forecast delivery dates. One of my tenets is that humans are terrible at predicting the future. The upside: if we’re this bad at it, even small improvements can make a huge difference.

So, how can we use data to plan better?

Here are two potential routes up this mountain:

  • Using data we already have
  • Measuring as you go

Using Data We Already Have

Time-Tracking Data

I’ve been a consistent time tracker at work for over 10 years now. I use Toggl Track to record where my time goes. How I wish I’d kept all 10 years of that data.

Before I made the jump to consulting, that data showed me how fast I did (or didn’t) work when building a dashboard. Now it helps me write more accurate proposals for clients.
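
Here’s a minimal sketch of what that looks like in practice, assuming a Toggl-style export with one row per time entry; the column names and hours below are made up:

    import pandas as pd

    # Hypothetical time entries in the shape of a Toggl Track export.
    entries = pd.DataFrame({
        "task_type": ["dashboard", "dashboard", "dashboard", "etl", "etl"],
        "hours":     [6.5, 9.0, 14.0, 3.0, 5.5],
    })

    # Estimate from observed history, not gut feel: the median is the
    # "typical" figure; the 85th percentile is the safer number to quote
    # in a proposal.
    stats = entries.groupby("task_type")["hours"].quantile([0.5, 0.85]).unstack()
    stats.columns = ["median_hours", "p85_hours"]
    print(stats)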

But I recognize that time tracking to the minute is exceedingly rare.

So what other sources can you use to plan?

  • Elapsed Time: Most teams use some form of work tracking, like Jira, Azure DevOps, or another tool, to plan and prioritize work. Pulling the elapsed time in work days from work start to work end is a helpful proxy for effort (see the sketch after this list).
  • Work Accomplished Per Sprint: Agile, or some variation of it, is the default operating system for many teams. Counting the features released per sprint gives you an estimate of your team’s velocity.
  • Surveys: Don’t do this! You already know how bad people are at estimating the work a task needs; the numbers you get back will be meaningless.
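
If you can export work items with start and finish dates plus a sprint label, both of those proxies fall out of a few lines. A rough sketch, with hypothetical field names and dates:

    import numpy as np
    import pandas as pd

    # Hypothetical export from Jira, Azure DevOps, or similar.
    items = pd.DataFrame({
        "started":  ["2024-03-01", "2024-03-04", "2024-03-11"],
        "finished": ["2024-03-08", "2024-03-06", "2024-03-20"],
        "sprint":   ["S1", "S1", "S2"],
    })

    # Elapsed time: business days from start to finish, a proxy for effort.
    items["elapsed_days"] = np.busday_count(
        items["started"].to_numpy(dtype="datetime64[D]"),
        items["finished"].to_numpy(dtype="datetime64[D]"),
    )

    # Velocity: how much work actually shipped in each sprint.
    velocity = items.groupby("sprint").size()
    print(items[["sprint", "elapsed_days"]])
    print(velocity)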

Measuring As You Go

Instead of trying to estimate everything upfront, measure as you go. Track real progress against actual outcomes. This works because it replaces prediction with observation.

You have a choice to make:

  • Fixed scope, but flexible delivery date
  • Flexible scope, but fixed delivery date

In either case, measure consistently at set intervals.

Fixed Scope

  • Phase Gate Reviews: Formal checkpoints tied to deliverable maturity: requirements complete, design complete, build complete, test complete, deploy complete. Best for sequential or waterfall-style data initiatives (governance rollouts, platform migrations).
  • Percent Complete by Work Package: Agile-lean hybrids sometimes track:

    0% (start) → 20% (setup done) → 40% (core logic built) → 60% (validated) → 80% (UAT passed) → 100% (delivered).

    Best for analytics engineering or BI, where work happens in defined chunks (see the sketch after this list).
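
To make the percent-complete idea concrete, here’s a tiny sketch that projects a finish date from the checkpoints observed so far; the stages mirror the progression above, and the dates are invented:

    from datetime import date, timedelta

    # (percent complete, date the checkpoint was reached) -- invented data.
    checkpoints = [
        (0,  date(2024, 4, 1)),
        (20, date(2024, 4, 5)),
        (40, date(2024, 4, 12)),
    ]

    # Observed pace so far: calendar days per point of progress.
    pct_done, latest = checkpoints[-1]
    days_per_pct = (latest - checkpoints[0][1]).days / pct_done

    # Naive linear projection to 100% -- observation replacing prediction.
    projected = latest + timedelta(days=days_per_pct * (100 - pct_done))
    print(f"{days_per_pct:.2f} days per percent; projected finish: {projected}")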

Fixed Delivery

  • Time-Boxed Cadence: Scrum, Kanban, and Shape Up use consistent rhythms: daily standups, weekly demos, biweekly sprint reviews. Best for iterative or ML feature delivery.
  • Fibonacci/Logarithmic Checkpoints: Used in lean/agile forecasting to capture nonlinear progress perception: 1 → 2 → 3 → 5 → 8 → 13 days/weeks. Acknowledges that uncertainty grows with project length. Best for discovery-heavy work (e.g., model development).
  • OKR-Style Milestone Burndown: Measure progress as movement toward objective key results: start (0%), biweekly confidence updates, midpoint review, final assessment. You’re not tracking “% complete”; you’re tracking confidence trendlines (see the sketch after this list). Best for outcome-driven analytics projects.
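
Here’s a sketch of what a confidence trendline can look like as data rather than a feeling; the check-in values are illustrative:

    # Biweekly confidence check-ins for one key result (illustrative values).
    confidence_log = [
        ("kickoff",  0.80),
        ("week 2",   0.70),
        ("week 4",   0.55),
        ("midpoint", 0.60),
    ]

    # The signal is the trend, not any single reading: several declining
    # check-ins in a row are a cue to cut scope or move the date.
    drops = sum(
        1 for (_, a), (_, b) in zip(confidence_log, confidence_log[1:]) if b < a
    )
    trend = "declining" if confidence_log[-1][1] < confidence_log[0][1] else "holding"
    print(f"trend: {trend}; {drops} of {len(confidence_log) - 1} check-ins declined")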

Methods That Work for Both

  • Event-Based Checkpoints: Trigger reviews when meaningful events occur — first dataset ingested, first dashboard draft, first stakeholder signoff, first deployment. These reflect natural “inflection moments” in analytics work, regardless of duration.
  • Value Realization Milestones: Measure when business value becomes visible: data live → metric visible → decision made → outcome realized. Each checkpoint indicates a deeper level of adoption and ROI. Best for analytics transformation programs where outcomes lag delivery (see the sketch after this list).
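
A value-realization log can be as simple as a dictionary of milestone dates. Everything below is hypothetical:

    from datetime import date

    # Value-realization milestones; None means "not reached yet."
    milestones = {
        "data live":        date(2024, 6, 3),
        "metric visible":   date(2024, 6, 17),
        "decision made":    date(2024, 7, 22),
        "outcome realized": None,
    }

    # The lag between delivery and adoption is the number worth watching.
    go_live = milestones["data live"]
    for name, when in milestones.items():
        if when is not None:
            print(f"{name}: +{(when - go_live).days} days after go-live")
        else:
            print(f"{name}: not reached yet")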

Getting Work Done

At the end of the day, don’t get stuck in analysis loops. It’s easy to spend more time optimizing the system than actually using it.

We’ll never be perfect at predicting. But we can get better at observing… and that’s what turns plans into performance.

Until next time,
Shaun

***

Subscribe to Analytics Advantage

***

Shaun Davis, your personal data therapist, understands your unique challenges and helps you navigate the data maze. With keen insight, he discerns the signal from the noise, tenaciously finding the right solutions to guide you through the ever-growing data landscape. Shaun has partnered for 10 years with top data teams to turn their data into profit- and efficiency-hunting action. Learn more about Shaun.