EC-301i · Module 2

The Data Challenge

4 min read

The data challenge is the most common executive objection to AI presentations. Your methodology is questioned. Your numbers do not match the ones they have from a different source. Your sample is called insufficient. These challenges recur so predictably because AI metrics are genuinely uncertain, genuinely early-stage, and genuinely contested in many organizations. The data challenge is not an unfair attack; it is a reasonable response to real uncertainty, and you must handle it without becoming defensive or capitulating.

  1. Explain the methodology before defending the conclusion. The executive who challenges the data needs to understand how it was produced before they can evaluate whether the conclusion is valid. Walk through the methodology specifically: what was measured, how, over what period, with what controls. The methodology explanation is not defensive; it is the evidence that the number is real. A number without a methodology is an assertion. A number with a methodology is a finding.
  2. Acknowledge where uncertainty genuinely exists. The AI dataset has real uncertainty. State it explicitly and specifically. "The 847-claim sample produces a 95% confidence interval of $3.90 to $4.50 per claim around a point estimate of $4.20. Even at the high end of the interval, the deployment economics are positive." Acknowledging the uncertainty specifically demonstrates that you understand the data, not that you are defending a weak position.
  3. Invite the counter-data. When the executive says "that doesn't match our numbers," ask for their numbers. "I would like to understand the difference — can you share the data you are working with? We may be measuring different things, or there may be a definitional difference that I can reconcile." The offer to compare datasets is credibility-building: it signals that you are confident the methodology will hold up to comparison.
  4. Distinguish the challenge: methodology vs. conclusion. Is the executive challenging the methodology (how you measured), the data (the numbers themselves), or the conclusion (what the data means)? These are different challenges with different responses. A methodology challenge requires a methodology response. A data challenge requires a reconciliation response: compare the two datasets and their definitions. A conclusion challenge requires an evidence response: more data, not better process. Diagnosing which challenge you are receiving prevents the waste of responding to the wrong one.
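The interval quoted in item 2 follows from the standard normal-approximation formula for a sample mean: the point estimate plus or minus 1.96 standard errors. As a minimal sketch, the sample standard deviation below (4.45) is an assumed value chosen to reproduce the numbers in the text, not a figure from the source.

```python
import math

def confidence_interval(mean, sd, n, z=1.96):
    """95% CI for a sample mean via the normal approximation: mean ± z * sd / sqrt(n)."""
    half_width = z * sd / math.sqrt(n)
    return mean - half_width, mean + half_width

# Hypothetical inputs: $4.20 point estimate over the 847-claim sample;
# sd = 4.45 is an assumed per-claim standard deviation, not from the source.
low, high = confidence_interval(4.20, 4.45, 847)
print(f"${low:.2f} to ${high:.2f}")  # → $3.90 to $4.50
```

Being able to show a calculation like this is what turns the number from an assertion into a finding: the executive can see exactly where the $3.90-to-$4.50 range comes from and which inputs drive it.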